The future of UI will be boring (scottberkun.com)
66 points by dan_sim on Jan 15, 2010 | 36 comments



Interesting that he left out the iPhone. To me, that is the most recent big innovation in usability. Of course, it is a different form factor than the PC, and that is the point. There is not much reason for a radical change in interface if the form factor doesn't change.

The other big example is the Wii, and now what Microsoft has been demoing with Project Natal (assuming it ships in the near future). In these cases, the key factor is that you are operating a device from a distance, which allows the use of a wider range of human motion in controlling it, now that the technology to support it exists.

I suppose the lesson is, form factor and innovative interface design cannot be decoupled.


From the article: "The fancy shmancy argument is that dominant design repels most attacks. There are lots of bad ideas that were adopted first, became dominant, and have been impossible to shake. The DVORAK vs. QWERTY keyboard debate is a canonical example. It doesn’t matter if DVORAK is actually 5x better that QWERTY, the cost of relearning is perceived to be prohibitive, so most people never have the motivation to try, and there are huge reinforcements of the status quo (e.g. people who teach typing classes). Metric system vs. English in the U.S. is another good example. A particularly retarded example of dominant design is electric plugs. Studying why the world has 50 different plugs and voltages explains much about resistance factors against innovation. Or world peace."

This is actually a favorite nitpick of mine. I don't really buy the argument that established standards dominate even after they've been proven to be inferior. First of all, the Dvorak vs. Qwerty situation isn't as clear-cut as it is commonly made out to be -- while there's lots of anecdotal evidence, the few independent studies (i.e. those not done by Dvorak himself) aren't very conclusive. At the very least, they don't show a 5x improvement.

The cost of switching to the metric system likely isn't as great as commonly believed, and there's no harm in running both side-by-side for a while; there are many precedents for this from other countries. While as a European I consider the imperial system to be clearly inferior, I think the real reason the U.S. doesn't adopt the metric system has more to do with emotional attachment and xenophobia than with cost. The point here is, if Americans genuinely consider the imperial system to be better, switching would not be an improvement.

The electric plug situation is rapidly improving, in part because of homogenization pressure, in part because modern electronic devices don't particularly care what voltage you feed into them; most power supplies now accept anything from roughly 100 to 240 V. (Be careful with adapter plugs and hairdryers, though; you might start a fire if the voltage is too high.) Also, this is a case where none of the existing standards is inferior to any other, so it's not even an example of a bad design becoming dominant.

In short, my point is that if a new convention is clearly better, it's usually possible to switch gradually, and that this is usually done. The effect of "dominant design" is greatly exaggerated.


> First of all, the Dvorak vs. Qwerty situation isn't as clear-cut as it is commonly made out to be

Indeed. Reason Magazine did an in-depth article on this a few years back.

http://reason.com/archives/1996/06/01/typing-errors


Unfortunately, that article isn't really so much a defense of Qwerty as it is a defense of free-market fundamentalism. The writer doesn't seem to care as much about usability as he does about attacking the credibility of as many free-market-mistrusting Dvorak supporters as he can track down. Not to say that he's necessarily wrong, but the article is really only superficially about keyboards.

Ultimately, it is indeed not as clear-cut a situation as it's made out to be, either in Dvorak's or Qwerty's favor.


The classic disproof of Dvorak's superiority to Qwerty is "The Fable of the Keys" (http://www.utdallas.edu/~liebowit/keys1.html), which was also the lead chapter in the authors' book "Winners, Losers & Microsoft", a general attack on network (or lock-in) effects.


Good thread guys. Couple of things:

The specific rant was aimed mostly at the desktop. There's lots of energy behind moving iPhone-ish UI to the desktop, but it will mostly fail there for ergonomic reasons.

Sci-fi movies are horrible predictors of the future, especially when it comes to UI. They're fun and inspiring, which is good, but using them as something to copy is ridiculous.

The third-world argument is a good one. If people can skip past our desktops, they dodge the dominant-design issues. But then it also has to be cheap. Cell phones are a great story: cell phones are the first phones much of the world has ever had, and cell towers are cheaper than landlines.

> jorsh wrote: I'm paying attention to the applications, not the OS windows.

He's totally right. You rarely need a paradigm shift to achieve whatever it is you imagine the effect of your work to be. That was my point at the end.

-Scott (www.scottberkun.com)


Computers 30 years ago: http://www.classiccmp.org/dunfield/c64/h/complete.jpg

Computers now: http://www.digitaltrends.com/wp-content/uploads/2009/11/appl...

Sure, the basics are the same. However, nothing about these changes is "boring." I would argue we have nothing to worry about: in 5, 10, 15 years, we'll have some pretty cool gadgets to play around with.

Take a look at cell phones: everything is starting to use touch screens. Tablets are going to become popular, and eventually that will seep into desktops and laptops.

Don't worry, we'll be fine.


I'm still not convinced that tablets will be popular in the near future. The current paradigm of tablet PCs (make the screen touch-sensitive and chop off the keyboard) has major ergonomic and usability problems. How do you hold one for more than 5 minutes? How do you input text without it being a complete chore? The software and hardware will require radical changes to solve these problems. Maybe Apple has thought of something, but a big iPod Touch won't work.

Voice recognition isn't viable either. Do people really want to talk to these machines all day, in offices, subways, and coffee shops? Doubtful.


I think people who believe tablets are going to replace the PC are missing the point. One of the big concepts we're building at my work is the idea of devices that are designed around limited input scenarios.

In other words, we believe there's room for two types of devices: viewing devices with limited input (like tablets) and input devices (like PCs).

To give a personal example, the most obvious usage case here is the e-book. If you could have every book you've ever owned at the touch of your finger, that would be worth something, even if you were limited to basic input on the device. Now combine e-books with all your movies and music, and you start to see how there's room for devices with limited input.

(For the record, we've been using ELO touch screens to test usage scenarios until the perfect tablet shows up.)


You make some good points. I think there is some niche potential for tablet devices: hospitals, drawing, books (as you mentioned).

On the other hand, e-book readers have found success by doing things laptops can't, like lasting for a week on a charge, being legible in bright sunlight, and being extremely lightweight. I don't see the same happening for movies (laptops are better for movies since you don't have to hold them up) or music (iPhone/Android wins for that).

Also, e-book readers are replacing books, so while you're carrying another device around, it's probably a net reduction in size and weight for most people.

Here's a concept that would sell like hotcakes: a laptop with a dual-mode LCD/e-ink 10", touch-sensitive, swivel display that can be held like a tablet, 24-hour (or more) battery life, half an inch thick, and less than a pound. Not sure if it's technically possible right now, and I'm sure it'd be quite expensive if it is, but when such a device becomes widely affordable, these niche tablet/e-book use cases would just be part of your laptop.


Not exactly the same, but take a look at the Lenovo U1 hybrid: http://www.engadget.com/2010/01/05/lenovo-ideapad-u1-hybrid-...


Here's a startup with a commercial version of the OLPC display:

http://spectrum.ieee.org/computing/hardware/winner-pixel-qis...


Larger tablets (i.e. beyond iPhone size) work just fine with handwriting recognition; it's worked well for over 10 years now. Sure, you need a stylus rather than a finger, but again, on a larger tablet, finding a couple of millimeters for a stylus holder is easy, and you can fall back to a virtual keyboard for short-duty entry where a stylus is impractical.

When I used to use handwriting recognition (back around when OS X first came out), I found it far more comfortable than typing, and I'm a fairly decent touch-typist (~100 wpm).


> The current paradigm of tablet PCs (make the screen touch-sensitive and chop off the keyboard) has major ergonomic and usability problems.

You can have tactile feedback with touch screens.


I think keyboard/mouse -> touch is, itself, a trivial change. If we're just tapping virtual keyboards and 'clicking' on program icons with our fingers, what has really changed?

I think the kind of paradigm shift Gruber was getting at is away from things like hierarchical 'file' systems and proprietary, application-centric file formats.

Personally, I think the biggest missed innovation in the iPhone SDK was not having Newton-esque, app-agnostic shared data stores.
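To make the idea concrete, here's a rough, hypothetical sketch of what an app-agnostic store of typed records (shared across apps instead of siloed per app) might look like. None of these names are real iPhone SDK APIs; it's just an illustration of the shape of the idea, written as a small Swift script:

    // Hypothetical sketch of a Newton-style, app-agnostic shared data store.
    // None of these types exist in any real SDK; the point is records owned
    // by no single app, queryable by type from anywhere.
    import Foundation

    // A record is a typed bag of fields rather than a file owned by one app.
    struct SharedRecord {
        let type: String                 // e.g. "contact", "note", "appointment"
        var fields: [String: String]
    }

    // Any app could add and query records of a given type through one store.
    final class SharedStore {
        private var records: [SharedRecord] = []

        func add(_ record: SharedRecord) {
            records.append(record)
        }

        func query(type: String) -> [SharedRecord] {
            records.filter { $0.type == type }
        }
    }

    // Example: a mail app and a calendar app would both see the same contact.
    let store = SharedStore()
    store.add(SharedRecord(type: "contact",
                           fields: ["name": "Ada", "email": "ada@example.com"]))
    print(store.query(type: "contact").compactMap { $0.fields["name"] })

The interesting part isn't the code; it's that ownership would move from apps to record types, so any app could read "contact" or "note" records without knowing which app created them.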


API > Shared data store, for most cases.


This is something I've thought for ages; the future probably won't look like the sci-fi-movie version of the future, with holoscreens and voice recognition and gestures. Useful always beats cool in the long run, no matter how you market it.

Though I still believe there's a strong chance that some unforeseen UI innovation will appear and change the game.


Exactly. Certain ideas seem perpetually "futuristic" but never seem to catch on: voice recognition ("Imagine dictating your letters!" => really annoying to make corrections), gestures ("Imagine waving instead of using a mouse!" => gorilla arm), video chat (I want to wear my pajamas and not pay too much attention), and so on.

That said, I don't want to discount some new innovation taking over :).


Dictating letters is still niche and will likely remain so, except for people who have severe RSI or other reasons to go through the training required. If I'm spending enough time at one sitting to write a letter, and I've already invested in learning to touch-type, then I'm a lot faster typing than dictating, even if the dictation is perfect.

On the other hand, voice recognition for specialized vocabularies is happening. I've noticed a progressive replacement of "push 1 for X, push 2 for Y" with voice interfaces in support calls. My car has voice recognition for the sound system that actually works, e.g. "Play Track Shelter" plays a song of that name. While there are still freaky behaviors, these work well in that they don't waste my time when I'd rather be doing something else -- like having my problem fixed or listening to Icon of Coil. Because they are for specific narrow domains, these applications of voice recognition coexist with my typing, touch, et al.

Put another way, new UIs and input devices seem to work best when they have a path for incremental adoption. It's rare to see a complete overnight change. Even the iPhone, as amazing and revolutionary as it is, built on pre-existing experience with touch screen phones. (Maybe "reacted against" is a better term than "built on.") Plus the iPhone still has a keyboard, even if it's a neat touchscreen one instead of a hardware one.


While this is a wonderful article, there's something the OP failed to mention: country differences.

Just a simple example: the US has wide usage of 1G mobile phones, while China has a huge 2G GSM market; now the US is going from 1G directly to 3G, and China should prepare for 4G now.

I think this cycle has not yet been fully exploited under the current globalization, and peer competition can spur more innovation. It's like a cold war without hostility.


Well, I'm working on a new UI/GUI applicable to any device, and on HCI/BMI integration for it. It's not boring at all: usable, sci-fi-like, voice recognition, etc. I can't disclose much because it's a research project (NDA). Perhaps in the future I'll be able to make it public.

@IsaacL: Well, future UIs will be like those in sci-fi movies, but with usability in mind. I mean, a UI that doesn't let you do what you want to do is pretty useless.


Out of interest, and I hope you can answer this despite the NDA, was 'sci-fi like' what you planned to achieve from the outset, or did it come about as a result of you designing a UI?

(I.e., did you start designing a UI and end up with something sci-fi-like, or did you take a sci-fi interface and try to make it usable?)


Well, it started as a better UI than Google's (I've said too much here), so I started designing the UI with search as a focus. Then later it evolved.

As for the usability question, most of the UIs I design are usable from the start, in the sense that they don't suffer from designer syndrome.

If you want to see a sci-fi UI that suffers on usability, check out Mozilla's Aurora concept: http://www.adaptivepath.com/aurora/


fh wrote: "In short, my point is that if a new convention is clearly better, it's usually possible to switch gradually, and that this is usually done."

I'd love to see any data at all about this. Or even a few examples that aren't cherry-picked.

You also wrote: "The electric plug situation is rapidly improving".

I have seen zero evidence for this. I've yet to travel to any other continent without this being a problem.


(Maybe this was supposed to be a reply, instead of a top-level post?)

Anyway, as little as 20 years ago you'd routinely need adapters when traveling from one European country to the next. That's hardly an issue anymore, and I consider this a vast improvement.

You say you have problems when traveling to "other continents", without saying which continent you start from. I'll assume you're from North America, because then you're indeed a bit out of luck, as this map shows: http://upload.wikimedia.org/wikipedia/commons/d/d7/WorldMap_... However, that's a North American problem, not an international one.

As to your other point, I didn't cherry-pick the examples; I addressed the examples from an article that argues the exact opposite.


I don't understand your retort about plug differences being a "North American problem." Just as an American needs adapters in Europe, a European needs adapters in the U.S. Not to mention that everyone is still potentially screwed across most of South America and Africa, and, indeed, most of Asia.


Welcome to Hacker News (non-sarcastically!). Comments are threaded here, so you could have replied directly to fh's post.


I'm looking forward to the time when I can program and control my computer entirely using voice commands and virtual 3D holographic typing.


I, too, look forward to a day when I'm at work, and am surrounded by people mumbling at their computers and waving their arms.

No, sir, that won't be distracting in the least.


I am reminded of the day a co-worker of mine had an ergonomic evaluation done. When the ergonomist(?) saw the contorted way he typed (hunt-and-peck at that), which apparently he had been doing forever, she could only suggest he might try voice recognition software. Thankfully he didn't take her advice.


Subvocalization?


Boring? God I hope so. Classic MacOS had the right idea aesthetics-wise. Boring, clean, neutral grays. I'm paying attention to the applications, not the OS windows.


Class is not the same as boring, and the iPhone doesn’t use flashy colors anywhere (except where flashy colors are appropriate: to help distinguish between different apps, for the low-battery warning, the reminder badge, the slider to turn it off, you get the idea). Nobody in their right mind would call the iPhone’s UI boring, never mind its subdued dull blue and grey.

OS X was candy-colored. A decade ago. The iPhone lost the candy; Mac OS X has some vestigial buttons and scrollbars but is otherwise very much moving towards battleship grey. But its past, present, and future aren’t exactly boring.


Excuse me, who is this douche and why do I care? I find it hard to believe that anyone who considers the semicolon "vestigial emoticon fodder" has written anything of much significance or value.


Honestly, when I read that I thought the same thing, but just because he isn't 'one of us' doesn't mean his points about UI aren't valid. There's been a lot of good writing done by people who don't know how to terminate a statement in C.


C? Try English. It is an ill-developed style that doesn't make judicious, effective use of the semicolon.



