# Unix in a keyboard-less age



## nslay (Nov 24, 2012)

Admittedly, I'm not well versed in the details of Unix history. But I'm guessing that Unix didn't always have a tty subsystem. That it originally started with paper tape. Then it evolved to have ttys. Then it later evolved to have ptys and mouse support. What's next?

Keyboards/mice may go away in the distant future, and there are already devices that lack them. So, how does a human interface with Unix then? I doubt shells/ttys are up to the task, and I imagine shells will eventually be relegated to interpreting scripts rather than user input.

So how do you imagine human-Unix interaction in the future?

I imagine some sort of hybrid between touch screen and AI (and maybe dictation). Touch screens are horribly inefficient, and to make them viable, I think you need smart software interfaces that can _predict_ user intentions.

I especially see it as an opportunity to make things right, particularly with some of those hilarious _problems_ mentioned in the UNIX-Haters Handbook (which is worth a read, for laughs). For example: wildcard expansion in the shell (which won't be an issue anymore in a keyboard-less future).


----------



## zspider (Nov 24, 2012)

The day the keyboard and mouse are no more is the day I call it quits. I'll find something else to replace the computer.


----------



## nslay (Nov 24, 2012)

zspider said:

> The day the keyboard and mouse are no more is the day I call it quits. I'll find something else to replace the computer.



But what if there were a general and practical interface that let you do everything as easily as (if not more easily than) with a keyboard/mouse?


----------



## kpa (Nov 24, 2012)

Touchscreens are not really there yet; it's just too clumsy to write anything equivalent to a typical UNIX shell pipeline made of multiple commands on them.


----------



## nslay (Nov 24, 2012)

kpa said:

> Touchscreens are not really there yet; it's just too clumsy to write anything equivalent to a typical UNIX shell pipeline made of multiple commands on them.



I don't think touch screens will ever be efficient by themselves. I really think future software will need AI interfaces that _predict_ user intentions for touch screens to ever be efficient. The same goes for dictation.

That said, I never suggested anything like a desktop interface either (as we think of desktops, anyway). It must be general enough to supplant the shell and practical enough to do real work reliably.


----------



## UNIXgod (Nov 25, 2012)

nslay said:

> Admittedly, I'm not well versed in the details of Unix history. But I'm guessing that Unix didn't always have a tty subsystem. That it originally started with paper tape. Then it evolved to have ttys. Then it later evolved to have ptys and mouse support. What's next?



The earliest incarnations actually did begin with tape, though before that came electrical telegraphy, created by Morse for the electrical telegraph using Morse code. Communications can be traced back to the earliest written history, from homing pigeons and human messengers back to Greek times, where various encryption-style logic like the Caesar cipher (i.e. rot13) can already be identified.

TTY actually was used pre-Unix, during the Multics and CTSS years, as they were beginning to explore time-sharing of computing resources. It stood for Teletype - teletypewriter, or simply teleprinter (i.e. an expensive typewriter).

Early documentation on Thompson's line editor ed() treats UNIX as if a hard-copy (paper-based) terminal would be used. Bill Joy's original vi() took several years to complete because of backward compatibility: a terminal driver had to be created to take advantage of a "glass" terminal while remaining compatible with "paper" terminals. In fact, the original _vi_ has a so-called fifth mode, "open mode", to do just that. The work which Joy put into the termcap library would later influence curses().

Naming conventions for signals also follow early computing experience: SIGHUP stands for `Hang Up`, the signal sent when the serial connection has been severed - the equivalent of hanging up a phone.
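For what it's worth, the mechanism is still with us; here is a minimal Python sketch of catching SIGHUP (the handler is my own toy example, not any particular daemon's - modern daemons often reinterpret SIGHUP as "reload configuration" now that real serial lines are rare):

```python
import os
import signal

def on_hangup(signum, frame):
    # Historically: the serial line dropped. Nowadays daemons often
    # treat SIGHUP as a request to reload configuration instead.
    print("got SIGHUP (signal %d)" % signum)

signal.signal(signal.SIGHUP, on_hangup)
os.kill(os.getpid(), signal.SIGHUP)  # simulate the 'hang up' on ourselves
```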



			
nslay said:

> Keyboards/mice may go away in the distant future, and there are already devices that lack them. So, how does a human interface with Unix then? I doubt shells/ttys are up to the task, and I imagine shells will eventually be relegated to interpreting scripts rather than user input.
> 
> So how do you imagine human-Unix interaction in the future?
> 
> I imagine some sort of hybrid between touch screen and AI (and maybe dictation). Touch screens are horribly inefficient, and to make them viable, I think you need smart software interfaces that can _predict_ user intentions.



Someone at one of my local user groups showed me that they had ported editors/vim to the iPad (iOS). As interesting as it was, there was no ESC key, since it was using the system-level keyboard widget. As a workaround, he pointed out that they had made a "shortcut" with another symbol on its keyboard. If such a thing is to be important enough not to be a toy you show your friends, someone will have to create a proper UNIX-layout keyboard to use in conjunction with the software.

I highly doubt anyone will be developing on these devices. Imagine writing a program by speaking to Siri. Simple 4th-gen interfaces could be created for end users, such as we have now with "Compose message to Alyssa P. Hacker", but verbose COBOL-style syntax has never really worked well for ironclad programming.

Or imagine administration (or even security) on a mobile device connected to a server. Something as simple as `chmod +x /path/to/file` could be spoken as "change mode add execute home username bin program".
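A toy sketch of that idea (Python; the vocabulary table and function name are invented for illustration - a real grammar would be far richer):

```python
# Hypothetical word-to-token tables for a "speech to shell" translator.
VERBS = {"add": "+", "remove": "-"}
PERMS = {"read": "r", "write": "w", "execute": "x"}

def speech_to_command(utterance, path):
    """Translate 'change mode add execute' plus an already-resolved
    path into the equivalent chmod invocation."""
    words = utterance.lower().split()
    if words[:2] != ["change", "mode"]:
        raise ValueError("unrecognized utterance: %r" % utterance)
    op = VERBS[words[2]]
    perms = "".join(PERMS[w] for w in words[3:])
    return "chmod %s%s %s" % (op, perms, path)

print(speech_to_command("change mode add execute", "/path/to/file"))
# -> chmod +x /path/to/file
```

Resolving "home username bin program" into an actual path is the hard part, of course; this only covers the easy half.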

Now, building in expansion, pipes, redirection, upper/lowercase sigils and so forth, a predetermined syntax will need to be created. With that we are looking at another dialect, another layer of abstraction. Most people who search Google or web commerce sites don't know SQL. Nor should they have to. The keyboard will never really go away. It's easy to confuse the consumer market with the professional market. The problem is that computers became a commodity, more of a toy than a productivity tool, when they became home devices.

There will always be people who only care about playing games and social connection. You won't find any applications for building a spreadsheet calculator via voice command on the phones or tablets. It's unlikely that we ever will get to that style of "star trek" future of computing.

Also we still live in the day and age where consumers make no distinction between the internet and the web. Then again these people are not concerned with what kernel they are running or even if it's secure or as fast as can be. They are the market for these devices. As long as we have programmers, secretaries and any profession which deals with data input and entry there will always be physical keyboards.

My 2 bits.


----------



## nslay (Nov 25, 2012)

UNIXgod said:

> Someone at one of my local user groups showed me that they had ported editors/vim to the iPad (iOS). As interesting as it was, there was no ESC key, since it was using the system-level keyboard widget. As a workaround, he pointed out that they had made a "shortcut" with another symbol on its keyboard. If such a thing is to be important enough not to be a toy you show your friends, someone will have to create a proper UNIX-layout keyboard to use in conjunction with the software.


*yawn* vim on iPad. More of the same old ...



> I highly doubt anyone will be developing on these devices. Imagine writing a program by speaking to Siri. Simple 4th-gen interfaces could be created for end users, such as we have now with "Compose message to Alyssa P. Hacker", but verbose COBOL-style syntax has never really worked well for ironclad programming.


Why not? Machine learning and computer vision can _create_ realistic photographs from simple labeled sketches (here). Why can't an AI assist you in development through otherwise inefficient interfaces like touch screen and dictation?



> Or imagine administration (or even security) on a mobile device connected to a server. Something as simple as `chmod +x /path/to/file` could be spoken as "change mode add execute home username bin program".


More of the same thinking ... How about content-based filesystems? "Show me pictures of my dog I took recently." No hint of file names or directory hierarchies (*because no human thinks of file names and directory hierarchies*). Nobody said information storage and security was limited to our current understanding of file systems.
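As a strawman, such a query is "just" metadata search underneath. A hypothetical Python sketch, with the photo records and tags invented for illustration (in a real system the tags would come from a vision model and the dates from EXIF data, not be hand-written):

```python
from datetime import date

# Invented example corpus.
photos = [
    {"file": "IMG_0001", "tags": {"dog", "park"}, "taken": date(2012, 11, 20)},
    {"file": "IMG_0002", "tags": {"cat"},         "taken": date(2012, 11, 21)},
    {"file": "IMG_0003", "tags": {"dog"},         "taken": date(2012, 6, 1)},
]

def query(tags, since):
    """'Show me pictures of my dog I took recently' becomes
    query({'dog'}, since=some_cutoff) -- no paths or filenames involved."""
    return [p["file"] for p in photos
            if tags <= p["tags"] and p["taken"] >= since]

print(query({"dog"}, since=date(2012, 11, 1)))  # -> ['IMG_0001']
```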



> Now, building in expansion, pipes, redirection, upper/lowercase sigils and so forth, a predetermined syntax will need to be created. With that we are looking at another dialect, another layer of abstraction. Most people who search Google or web commerce sites don't know SQL. Nor should they have to. The keyboard will never really go away. It's easy to confuse the consumer market with the professional market. The problem is that computers became a commodity, more of a toy than a productivity tool, when they became home devices.
> 
> There will always be people who only care about playing games and social connection. You won't find any applications for building a spreadsheet calculator via voice command on the phones or tablets. It's unlikely that we ever will get to that style of "star trek" future of computing.
> 
> ...



We could be using punch cards and switches for data entry still. But it's not very efficient. The keyboard and later the mouse made human-computer communication more efficient. It took us 20 years to master the mouse and GUI. Now we have touch screens and dictation and we are still designing software with the keyboard and mouse in mind.

I refuse to believe the keyboard and mouse are the limit.

I do think Siri-like software is the future ... imagine, an NLP-widget toolkit. Not graphical widgets, but language-based _widgets_.


----------



## bbzz (Nov 25, 2012)

Keyboard-less? You are clueless.

I won't touch a phone without mechanical qwerty keyboard.


----------



## drhowarddrfine (Nov 25, 2012)

Although it's talking about Windows 8, this guy writes about the difference between input methods between a desktop and mobile devices which may apply to this.



> But this only holds if the original premise is correct, that the tablet is the evolution of the laptop, and I just donâ€™t think thatâ€™s right. Where the division lies is not a the desktop and the mobile level, or between the laptop and the tablet, but between professional use (i.e. content creation), and light/entertainment use (i.e content consumption). While tablets are not necessarily used purely for content consumption, their limitations (small screen size and lack of a hardware keyboard) mean that this will always be their main use.
> 
> The PC does not die just because there are more mobile devices on the market, it remains to play its own role. There is a clear line between devices you use for things like writing, coding, photo editing, 3D design, and so on, and devices you use for reading, browsing the web, watching videos and playing games. While the latter can be done on both, the tablet and the PC, the former will always require a PC, and because of this, there will always be a need for an operating system tailored specifically for it.


----------



## nslay (Nov 25, 2012)

drhowarddrfine said:

> Although it's talking about Windows 8, this guy writes about the difference between input methods between a desktop and mobile devices which may apply to this.



Agreed, tablets are quite the toy presently. But I don't think they have to be, and I do think they can be productivity systems with a lot of UI work. Whether their plethora of input devices can be made more efficient than keyboards and mice remains to be seen ... I think they can be, with some smarter software.

The screen size might increase some day ... if they ever become more than just a novelty.


----------



## UNIXgod (Nov 25, 2012)

nslay said:

> *yawn* vim on iPad. More of the same old ...



Didn't mean to put you to sleep there, buddy. Ironic that someone spent time porting the editor but couldn't implement a proper interface (hmmm... sound familiar?)



			
nslay said:

> Why not? Machine learning and computer vision can _create_ realistic photographs from simple labeled sketches (here). Why can't an AI assist you in development through otherwise inefficient interfaces like touch screen and dictation?
> 
> More of the same thinking ... How about content-based filesystems? "Show me pictures of my dog I took recently." No hint of file names or directory hierarchies (*because no human thinks of file names and directory hierarchies*). Nobody said information storage and security was limited to our current understanding of file systems.



Never said it wasn't possible. No one really works on those types of things unless funding is in place. AI also didn't mature as quickly as was expected back in the '60s and '70s.

If you're passionate about this sort of thing, there is nothing stopping you from building your own interface on top of the preexisting technology.

Though what you're telling me is you'd like to create a spoken context-search interface, which is still an abstraction of regular expressions and standard queries. Assuming there is an open-source library already available for speech recognition, I would start by building against that with an active-record pattern, so it would be "Display pictures of dog by `date`". Of course, date could be anything from 4.days.ago to now, or "recent" could be a preference based on an integer value of time in a range of int.min, int.hour, int.day and so on.
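To make the "4.days.ago to now" part concrete, a Python sketch of resolving a spoken time phrase to a cutoff (the function name and the "recent" preference are mine, not an existing library):

```python
from datetime import datetime, timedelta

# Hypothetical user preference: what 'recent' means, in days.
RECENT_DAYS = 7

def cutoff_for(phrase, now=None):
    """Map a spoken time phrase to the earliest timestamp to match."""
    now = now or datetime.now()
    words = phrase.split()
    if phrase == "recent":
        return now - timedelta(days=RECENT_DAYS)
    if len(words) == 3 and words[1] == "days" and words[2] == "ago":
        return now - timedelta(days=int(words[0]))
    raise ValueError("unrecognized time phrase: %r" % phrase)

now = datetime(2012, 11, 25)
print(cutoff_for("4 days ago", now))  # -> 2012-11-21 00:00:00
print(cutoff_for("recent", now))      # -> 2012-11-18 00:00:00
```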



			
nslay said:

> We could be using punch cards and switches for data entry still. But it's not very efficient. The keyboard and later the mouse made human-computer communication more efficient. It took us 20 years to master the mouse and GUI. Now we have touch screens and dictation and we are still designing software with the keyboard and mouse in mind.
> 
> I refuse to believe the keyboard and mouse are the limit.


GUIs and mice are a bit older than you think. You're off by a decade easily, if not more. It's just another interface. Compatibility comes in layers. Interfaces are always separated from the logic, as the logic is always separated from the data, which could come in many forms, styles and media types. It's all about being as modular as possible to cope with change, as the world of computing is changing all around us all the time.



			
nslay said:

> I do think Siri-like software is the future ... imagine, an NLP-widget toolkit. Not graphical widgets, but language-based _widgets_.



Siri is more or less beta. Moore's law may fix that as the low-level implementations become better and closer to real-time processing. As to the spec I explained above, one would need to also build an interface to other engines (for example whatever Android uses, and BlackBerry, Palm and M$ where applicable)... Since audio is dealt with on the user's device, transcriptions will need to be done and streamed over to your BSD server network.

As for neural networks and learning machines, we haven't gotten much further than silly Markov-chained pseudo software psychologists. There will always be a market for software which acts smart, though. As for development, we already have programmable programming (i.e. Lisp, Smalltalk, Ruby). It's a far cry to expect us to get to actual programming with any other type of interface than one which allows text input. No one programs with a mouse. Some may compose, as in the arts, music, video or animation with 4th-generation software tools which may be closer to being like IDEs.
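Those Markov-chained pseudo psychologists really are only a few lines; a minimal word-bigram sketch in Python (the training text is just a placeholder):

```python
import random
from collections import defaultdict

def train(text):
    """Build a word-bigram table: each word maps to the words seen after it."""
    chain = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def babble(chain, start, length=8, seed=0):
    """Emit a random walk through the bigram table, starting at 'start'."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = chain.get(out[-1])
        if not nxt:
            break
        out.append(random.choice(nxt))
    return " ".join(out)

chain = train("the computer talks and the user talks and the computer listens")
print(babble(chain, "the"))
```

It acts smart for about two sentences, which was roughly the state of the art of the Eliza era.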

Though I like the idea. If your future ever does happen where everything is truly automated it may be interesting to voice the command after buffer overflow "Computer, debug fusion impulse pointer while normalizing dilithium crystal memory segment; Number One to the bridge!" =)


----------



## nslay (Nov 25, 2012)

UNIXgod said:

> Siri is more or less beta. Moore's law may fix that as the low-level implementations become better and closer to real-time processing. As to the spec I explained above, one would need to also build an interface to other engines (for example whatever Android uses, and BlackBerry, Palm and M$ where applicable)... Since audio is dealt with on the user's device, transcriptions will need to be done and streamed over to your BSD server network.


I don't know. I would hope that PCs and mobile devices continue to become more powerful. With more powerful devices and improved algorithms, perhaps this sort of toolkit could run on the device itself. The _cloud_ may stifle that (all in the name of marketing/protecting IP).



> As for neural networks and learning machines, we haven't gotten much further than silly Markov-chained pseudo software psychologists. There will always be a market for software which acts smart, though. As for development, we already have programmable programming (i.e. Lisp, Smalltalk, Ruby). It's a far cry to expect us to get to actual programming with any other type of interface than one which allows text input. No one programs with a mouse. Some may compose, as in the arts, music, video or animation with 4th-generation software tools which may be closer to being like IDEs.


Machine learning has advanced quite a lot (especially for computer vision tasks). I'm not sure about AI and NLP ... Watson is pretty impressive but I don't know very much about NLP.

With massively parallel computing emerging, I imagine new types of languages that are better suited to massive parallelism will appear. They'll probably be text-based ... but perhaps non-text-based languages are possible (workflows are kind of an example ... not always practical though).



> Though I like the idea. If your future ever does happen where everything is truly automated it may be interesting to voice the command after buffer overflow "Computer, debug fusion impulse pointer while normalizing dilithium crystal memory segment; Number One to the bridge!" =)



At some point, I imagine we may even rely on computers to automatically propose hypotheses (i.e. idea machines, computers that read journal/conference papers and propose unexplored connections). Prerequisite knowledge required for research is exploding. On top of that, it can be difficult to make interdisciplinary connections.

I don't care about Star Trek. A lot of this is already possible if you could design the software to neatly package it all (and that is a formidable task). However, scientists don't typically write end-user software (or write it well; some even use MATLAB alone). It may be decades before you see any of it in products.


----------



## drhowarddrfine (Nov 25, 2012)

nslay said:

> Agreed, tablets are quite the toy presently.
> 
> The screen size might increase some day ... if they ever become more than just a novelty.


You really think that? 

I won't go into how I see the iPad used in a lot of workplaces nowadays, but my iPad goes with me every time I leave the office, go on trips, or just to the living room. There are a few companies that have converted them into cash registers.

Only recently did I get mine back from my son who borrowed it (for 8 months) but I need to find terminal software so I can use it instead of dragging my notebook around as much.


----------



## nslay (Nov 25, 2012)

drhowarddrfine said:

> You really think that?
> 
> I won't go into how I see the iPad used in a lot of workplaces nowadays, but my iPad goes with me every time I leave the office, go on trips, or just to the living room. There are a few companies that have converted them into cash registers.
> 
> Only recently did I get mine back from my son who borrowed it (for 8 months) but I need to find terminal software so I can use it instead of dragging my notebook around as much.



Almost certainly. It is a burden (for example) to prepare presentations, write papers and program on tablets. Even Swype does not compare to a real keyboard for these tasks. Perhaps if you plug a keyboard into a tablet ...

That said, I do know of someone with no hands who programs in C++ with dictation. It is possible ... but I imagine it's not easy.

I do think smarter software can solve this problem, and perhaps do it more efficiently than with a keyboard or mouse.


----------



## drhowarddrfine (Nov 25, 2012)

nslay said:

> Almost certainly. It is a burden (for example) to prepare presentations, write papers and program on tablets.


That doesn't make them toys. They're not there for doing those things, necessarily.


----------



## UNIXgod (Nov 25, 2012)

drhowarddrfine said:

> That doesn't make them toys. They're not there for doing those things, necessarily.



I just dug this up which may be relevant to the conversation. It's actually a good read. I'm not one to link bomb but it's an interesting story:

http://joelrunyon.com/two3/an-unexpected-ass-kicking

The follow up is also worth reading... Link in the first article.

Conversation on hacker news:

https://news.ycombinator.com/item?id=4342790


----------



## _martin (Nov 25, 2012)

UNIXgod said:

> It's actually a good read. I'm not one to link bomb but it's an interesting story:
> http://joelrunyon.com/two3/an-unexpected-ass-kicking



Thanks for the link! That _"god stuff"_ in the end was weird for me, but the rest is really good reading.


----------



## UNIXgod (Nov 25, 2012)

matoatlantis said:

> Thanks for the link! That _"god stuff"_ in the end was weird for me, but the rest is really good reading.



Yeah, it gets a bit hokey at the end. Far be it from me to judge anyone on their personal beliefs; still a pretty epic blog post. Social commentary and all.


----------



## throAU (Nov 26, 2012)

User facing stuff may not continue with keyboards.  The machines that provide services will.


----------



## purgatori (Nov 26, 2012)

Until you have a computer that can interpret an ambiguously-worded instruction with the same accuracy as the average human, we're not even close to having an interface that can match a keyboard for efficiency. In other words, we're talking _Star Trek_-esque advanced AI:



> Scotty: Computer! Computer?
> [He's handed a mouse, and he speaks into it]
> Scotty: Hello, computer.
> Dr. Nichols: Just use the keyboard.
> Scotty: Keyboard. How quaint.


----------



## Deleted member 30996 (Nov 26, 2012)

purgatori said:

> Until you have a computer that can interpret an ambiguously-worded instruction with the same accuracy as the average human, we're not even close to having an interface that can match a keyboard for efficiency.



Talking to chat bots
Cold and lonely winter night - 
Echoing my words


----------



## ChalkBored (Nov 26, 2012)

nslay said:

> So how do you imagine human-Unix interaction in the future?



By walking up to a terminal, saying, "It's a Unix system! I know this!" and using it to fly around the filesystem so you can lock the door before the raptors get through.
Like it has been since 1993.

Except flying around with a tablet should be easier, since you can just use the accelerometers in the tablet to navigate by tilting it.


Passwords will be a 9 letter dictionary word displayed algorithmically as a 3d cube. I guess tilting and touchscreen gestures will help out there, too.

Coding will work the same way as passwords, but it will involve more complicated 3D rotating objects that you piece together like a jigsaw puzzle. Nobody actually knows how it works; the people that do it for a living pretend to so they can keep their jobs, but they just wave their hands around while playing montage music in the background, and if it finally does something, they ship it.


All other text will be discarded completely in exchange for video recordings of whatever you were going to write.
Wikipedia articles will look like Max Headroom (the character, not the show), as they get revised/redacted.
Open Office will become a non-linear video editor that also makes spreadsheets. Microsoft will sue because they patented that for Office. Apple somehow manages to be the one that invents it, after seeing that everyone else has it.

Ubuntu will require that you be assimilated. You will add your biological and technological distinctiveness to their operating system. Your culture will adapt to service Canonical. Resistance will be futile.
Plus, they've done testing that shows it's easier to use that way.


----------



## UNIXgod (Nov 26, 2012)

Trihexagonal said:

> Talking to chat bots
> Cold and lonely winter night -
> Echoing my words



Keep it on topic. We are talking about FreeBSD on these forums not Haiku or any other BeOS related crap. Also chat bots don't echo(). They printf()!


----------



## Deleted member 30996 (Nov 26, 2012)

UNIXgod said:

> Keep it on topic. We are talking about FreeBSD on these forums not Haiku or any other BeOS related crap.



I was on topic:



			
purgatori said:

> Until you have a computer that can interpret an ambiguously-worded instruction with the same accuracy as the average human, we're not even close to having an interface that can match a keyboard for efficiency.



You just didn't understand it.



			
UNIXgod said:

> Also chat bots don't echo().



Or understand chatbots like Daisy or Billy either, who build a database/mindfile of words that are input during conversation, thus echoing words. I have the 9th highest-ranked bot at the PersonalityForge, which also competed in the Loebner Prize in Artificial Intelligence Turing Test, have programmed ALICE bots at Pandorabots, and created mindfiles for both Billy and Daisy that I made available for download to the AI community, so don't try and school me on chatbots.

In fact, you were the first one to bring up the subject:



			
UNIXgod said:

> As for neural networks and learning machines, we haven't gotten much further than silly Markov-chained pseudo software psychologists.



However, your knowledge of chatbots seems restricted to Eliza.


----------



## UNIXgod (Nov 26, 2012)

Trihexagonal said:

> Talking to chat bots
> Cold and lonely winter night -
> Echoing my words



Your quote above is a traditional haiku. My lame attempt at humor was to make a BeOS reference. 



			
Trihexagonal said:

> I was on topic:
> 
> You just didn't understand it.
> 
> ...



Wasn't attempting to school you. Just a poorly executed attempt at word play. Actually, now I'm interested. Pandorabots and PersonalityForge look like very interesting sites to explore.


----------



## Deleted member 30996 (Nov 26, 2012)

UNIXgod said:

> Wasn't attempting to school you. Just a poorly executed attempt at word play. Actually, now I'm interested. Pandorabots and PersonalityForge look like very interesting sites to explore.



No problem.


----------



## Crivens (Nov 26, 2012)

Wherever you go, Scott Adams was there before...


----------



## sim (Dec 3, 2012)

drhowarddrfine said:

> That doesn't make them toys. They're not there for doing those things, necessarily.



Agreed. Tablets are not (in my view) general-purpose computers in the sense that we traditionally understand, and it is misguided to think of them as such, and further to infer that they signpost the destiny of general computing. I think these and similar devices belong to a new parallel class - 'content consumption devices'. I have one, and it's great - within its problem domain.

Of more relevance to mainstream computing, I think, will be some of the enabling technologies that helped make modern tablet devices workable - ubiquitous networking, data mining, prediction, behaviour analysis, 'cloud' intelligence, etc. These advanced technologies will still require (more than ever) smart people who can architect and program increasingly sophisticated large-scale systems - the sort of systems that have traditionally had a solid foundation in Unix environments.

sim


----------



## drhowarddrfine (Dec 3, 2012)

I don't really like looking at them as consumption-only, and I'll give three examples. A hospital's doctors I do work with use iPads to look at patient history and data, and also use them for entering information. A restaurant chain I know uses iPads as cash registers. And, just a few days ago, I had maintenance done on my furnace; the tech did all the billing, credit card and confirmation email through his iPad.

So they seem to be taking the place of other handheld devices while offering more potential features and functionality. It would seem to offer a lot of freedom in a creative way.


----------



## vermaden (Dec 3, 2012)

nslay said:

> Admittedly, I'm not well versed in the details of Unix history. But I'm guessing that Unix didn't always have a tty subsystem. That it originally started with paper tape. Then it evolved to have ttys. Then it later evolved to have ptys and mouse support. What's next?



Currently the whole 'industry' makes the keyboard less useful and more 'stupid', unfortunately; check the keyboard layout differences between the Dell Latitude E6410 and E6420, or between the ThinkPad T420 and T430, and You will know what I mean.

Maybe that is their point: make it more 'stupid' and less useful, to have an argument that the 'on screen' one is 'better'.


----------



## UNIXgod (Dec 3, 2012)

vermaden said:

> Currently the whole 'industry' makes the keyboard less useful and more 'stupid', unfortunately; check the keyboard layout differences between the Dell Latitude E6410 and E6420, or between the ThinkPad T420 and T430, and You will know what I mean.
> 
> Maybe that is their point: make it more 'stupid' and less useful, to have an argument that the 'on screen' one is 'better'.



They did the same with the x220 to x230. ThinkPads are just generic machines now. No more decent laptops for programmers.


----------



## jwele (Dec 3, 2012)

I personally will never trade a keyboard for a touchscreen or touch based interface. I just couldn't live without the sound of a plastic key press in my life.


----------



## vermaden (Dec 3, 2012)

UNIXgod said:

> They did the same with the x220 to x230. ThinkPads are just generic machines now.


The change was generally between *20 and *30 models.



			
UNIXgod said:

> No more decent laptops for programmers.


I would even say *'No more decent laptops.'*


----------



## throAU (Dec 4, 2012)

Until we have voice recognition software that works reliably, keyboards will still be reasonably commonplace.

Voice recognition, gestures, etc. are all nice, but if the daemon/process that provides those services crashes, you're stuck.


----------



## nslay (Dec 4, 2012)

throAU said:

> Until we have voice recognition software that works reliably, keyboards will still be reasonably commonplace.
> 
> Voice recognition, gestures, etc. are all nice, but if the daemon/process that provides those services crashes, you're stuck.



Everything else provided by the operating system is also software and prone to crash. Some of it can bring down the entire machine (or leave the machine hanging). What's the difference?


----------



## nslay (Dec 4, 2012)

vermaden said:

> Currently the whole 'industry' makes the keyboard less useful and more 'stupid', unfortunately; check the keyboard layout differences between the Dell Latitude E6410 and E6420, or between the ThinkPad T420 and T430, and You will know what I mean.
> 
> Maybe that is their point: make it more 'stupid' and less useful, to have an argument that the 'on screen' one is 'better'.



Meanwhile, software continues to be dumber than a bag of hammers.

I personally believe smart software can overcome stupid input interfaces while being practical and convenient.

Dictation is about as stupid as it gets. But when NLP and AI advance enough so that you can hold a meaningful conversation with your computer, keyboards and mice will look really stupid and primitive by comparison. I mean, you could practically _ask_ your computer to do some challenging task that would otherwise require some heavy typing (as if you were asking another _person_ to do it).

But for now, I imagine smart software can already make pretty good use of something simpler, like a touch screen, while still being practical and convenient enough for productive work. Maybe even throw some dictation and accelerometers in there.

We've only just mastered mice and GUIs. We don't know jack about other types of input yet.


----------



## throAU (Dec 4, 2012)

nslay said:
			
		

> Everything else provided by the operating system is also software and prone to crash. Some of it can bring down the entire machine (or leave the machine hanging). What's the difference?



Add up the lines of code involved in a simple keyboard based TTY driver and compare to a voice recognition engine.

Compare RAM utilisation and CPU utilisation.

One of those programs will be a LOT easier to audit, debug, and keep stable.


----------



## nslay (Dec 4, 2012)

throAU said:
			
		

> Add up the lines of code involved in a simple keyboard based TTY driver and compare to a voice recognition engine.


Why stop at the keyboard or even TTY? Anything can fail in the kernel and leave you equally stuck ... even unprovoked.

But while we're at it, if such a service is a user space process then that is definitely more stable than anything in kernel space.



> Compare RAM utilisation and CPU utilisation.


*yawn*

Besides, what do you base that on? What you _think_ would be required for such a system?



> One of those programs will be a LOT easier to audit, debug, and keep stable.



The types of systems I imagine (and work with) build (train) themselves and are effectively black boxes. All you have is statistical theory to make guarantees (plus tried-and-tested maturity). You don't program smarts into software (and you wouldn't want to).


----------



## throAU (Dec 4, 2012)

nslay said:
			
		

> Why stop at the keyboard or even TTY? Anything can fail in the kernel and leave you equally stuck ... even unprovoked.



The TTY and kernel can't be omitted for administrative purposes.  They are the lowest level software and required to gain any sort of access to the box.

Note:  I'm *not talking about normal end user operation* here.

I am saying that the keyboard will remain for when things go pear shaped.



> But while we're at it, if such a service is a user space process then that is definitely more stable than anything in kernel space.



User vs kernel mode just means it won't take the entire system out (hard system crash) if it crashes in user mode.  

However, if your only method of interacting with the machine is broken, it really doesn't matter whether it is running in kernel space or user space - you can no longer interact with the machine.  

Running stuff in user mode is no magical cure for software bugs, and you'll find that the vast majority of remote exploits and software crashes on your box today are in fact in user space software.


----------



## drhowarddrfine (Dec 4, 2012)

nslay said:
			
		

> *yawn*


For a second I thought this was a Windows board. Or maybe Reddit. Can we show some semblance of respect here, please?


----------



## throAU (Dec 4, 2012)

drhowarddrfine said:
			
		

> For a second I thought this was a Windows board. Or maybe Reddit. Can we show some semblance of respect here, please?



No, it's fine.

It gives you a clear indication of noobs who have no idea what they're talking about, when they resort to such posts because they can't actually form a coherent argument.


----------



## KenJackson (Dec 4, 2012)

nslay said:
			
		

> Keyboards/mice may go away in the distant future and there are already devices that lack these. So, how does a human interface with Unix now?
> 
> So how do you imagine human-Unix interaction in the future?



The great prophet Vernor Vinge has already given us the answer in _Rainbows End_, page 105:



> He felt a moment of pure joy the first time he managed to type a query on a phantom keyboard and view the Google response floating in the air before him.



He also talks about kids that could learn to use the fidget interface so they could text each other in class without being noticed.  But older people have trouble learning the fidget interface, so they tend to stick with the phantom keyboard.

This, of course, was written as fiction, but it's not as extreme as I thought it was when I read it.  Shortly thereafter I read that someone was actually working on a prototype computer display in a contact lens.  And as for the phantom keyboard--if computers can identify people from camera images, why couldn't they see what you're typing without the keyboard?


----------



## vermaden (Dec 4, 2012)

nslay said:
			
		

> Meanwhile, software continues to be dumber than a bag of hammers.
> 
> I personally believe smart software can overcome stupid input interfaces while being practical and convenient.
> 
> ...



I do not have anything against new and/or better ways of 'telling' the computer what You want from 'him', but like everything else in computing, keyboards should get better and better as time passes, not less and less usable and more stupid.


----------



## fonz (Dec 4, 2012)

nslay said:
			
		

> Keyboards/mice may go away in the distant future and there are already devices that lack these. So, how does a human interface with Unix now?


With datagloves and eyephones. Ask Johnny Mnemonic, he knows.

Fonz


----------



## register88 (Dec 4, 2012)

Sometimes I think coding with voice input would be nice.


----------



## nslay (Dec 4, 2012)

drhowarddrfine said:
			
		

> For a second I thought this was a Windows board. Or maybe Reddit. Can we show some semblance of respect here, please?



I think you read too much into it.


----------



## nslay (Dec 4, 2012)

throAU said:
			
		

> No, it's fine.
> 
> It gives you a clear indication of noobs who have no idea what they're talking about, when they resort to such posts because they can't actually form a coherent argument.



You also read too much into it. CPU and RAM are more expendable resources than they used to be.

You're the newbie here. You still haven't told me what you base this claim on:



> Compare RAM utilisation and CPU utilisation.


----------



## Crivens (Dec 4, 2012)

fonz said:
			
		

> With datagloves and eyephones. Ask Johnny Mnemonic, he knows.
> 
> Fonz



I also think this is going to be the way to go. Not for everything, but for most end users this may be the best option. I recently read _Daemon_ and I feel that this is one kind of end-user interface that most users could accept. Aside from that, I can highly recommend the book and its sequel; they are real thought-provokers.

The interface using a HUD in your glasses has a big advantage: no one can easily look over your shoulder to see what you are browsing.

Also, it saves lots of space and material since no big 100" monitors need to be produced and shipped around the globe. Think environment.

What I would see as a disadvantage is the limited input bandwidth when it comes to non-fuzzy data, such as entering numbers or source code. I would not want to be seen or heard intoning the magic "#include <stdio.h>" or chanting "template of templates of templates" into the dark.
Chuck Moore's colorForth keyboard has only a few keys, and you could cover basic input to the system (numbers, text, whatever) in only a few hundred lines. That would be enough to provide exact input; the more complex gesture recognition could come later. 

So I see other types of keyboards and mice in the future, and also other GUIs or command lines. But I see little future for touch screens.


----------



## throAU (Dec 5, 2012)

nslay said:
			
		

> You also read too much into it. CPU and RAM is a more expendable resource than it used to be.
> 
> You're the newbie here. You still haven't told me what you base your claim on



Despite my join date, I've been looking after production unix systems and getting paid for it since 1996.

OK, the system is running like a pig and swapping like mad, and you need to log in to fix it.

What is more likely to work?  A voice control interface, or a terminal?

I'm not saying new UIs are bad.  I'm saying that low level troubleshooting is going to require low level tools.

Despite the proliferation of high-level languages, *some* people are still required to write in assembler.  This will be similar.

Sure, 90% plus of the future population will never touch a keyboard.  Some will though.


----------



## nslay (Dec 5, 2012)

throAU said:
			
		

> Despite my join date, I've been looking after production unix systems and getting paid for it since 1996.
> 
> OK, system is running like a pig and swapping like mad and you need to log into fix it.
> 
> ...



I couldn't say. Such a system doesn't exist ... the closest thing to it is Siri which runs on a remote server farm. If your Internet connection is severed, then I'd agree, you'd definitely need something else (like a keyboard) in such a case.

Complexity doesn't necessarily mean it's horribly unreliable or inefficient. What's more likely to work, a bicycle or a car? I'd say the bicycle, but the car is pretty reliable too.


----------



## throAU (Dec 10, 2012)

nslay said:
			
		

> Complexity doesn't necessarily mean it's horribly unreliable or inefficient.



No, complexity doesn't *guarantee* that your software is less reliable or inefficient, but it has historically proven to be a fairly reliable indicator.

No matter how efficient you make your code at what it does, equivalent quality code that DOES LESS will consume less resources, be easier to debug and more difficult to exploit.


Put it this way: how many product recalls do you see on new bicycles compared to new cars?

We've been building cars for say, ~100 years now.  And yet we still put out new cars with recall-worthy faults, despite most new vehicles going through a development and testing process that is far in excess of what the average bicycle design endures.


----------



## nslay (Dec 11, 2012)

throAU said:
			
		

> No, complexity doesn't *guarantee* that your software is less reliable or inefficient, but it has historically proven to be a fairly reliable indicator.


Historically, we've been advancing in technology ... and it's still pretty reliable (sometimes even more so than older and simpler technology). Granted, more parts do make for more points of failure.



> No matter how efficient you make your code at what it does, equivalent quality code that DOES LESS will consume less resources, be easier to debug and more difficult to exploit.



The real kicker is that machine learning methods build themselves (small chance for human error), they're algorithmically simple, and typically very efficient. But they're also often black boxes ... small price to pay for smarts.



> We've been building cars for say, ~100 years now.  And yet we still put out new cars with recall-worthy faults, despite most new vehicles going through a development and testing process that is far in excess of what the average bicycle design endures.



So, what are you suggesting? That we shouldn't try to replace the keyboard and mouse with something better because they're simple? By that logic, we should still be using the horse and carriage, or even bicycles. Both are very reliable and extremely simple, and the former is pretty practical for longer commutes.

That reminds me, horse and carriage have largely vanished and I anticipate the same for the keyboard in the distant future, much like its simpler predecessors: the punch card, paper tape, switches, etc... 

If the keyboard exists 100 years from now, then I'd agree, it might be purely for diagnostic purposes. I'm curious how UNIX will adapt to a keyboard-less environment.


----------

