Avoiding The Uncanny Valley of User Interface

I agree with this completely. Two cases come to mind: any web-based mail program (at least the big three, Google, MS and Yahoo) and any web-based document application (Office clones in web browsers).

It turns out that all people look alike. There aren’t actually that many differences from one person’s face to another. This is counter-intuitive, as each of us can easily identify the face of a friend out of thousands of pictures, but that’s just the point. Our brains actually recognize faces and then scrutinize them further so that we can pick out subtle differences.

What does this have to do with the Uncanny Valley? More than user interfaces, I assure you.

You see, the uncanny valley is the result of our brain recognizing something as a human face, and then paying much more attention to it. If a 3D model is not lifelike enough to warrant this extra attention, then we consciously accept it for what it represents. Once our brain starts to scrutinize the 3D model, it is going to start to recognize flaws that were previously ignored. This results in the model appearing less lifelike despite becoming more realistic.

Our brain does not scrutinize user interfaces the same way, so the idea of an uncanny valley for UIs just doesn’t make sense. Anything that one finds uncanny is the result of preconceptions about how a UI should work and look, which means that this post is merely an argument to maintain the status quo.

I experienced this reaction recently when I was shown [because I would have never tried this on a web app] that I can move events in Google Calendar from one day to another via drag-and-drop. One half of me thought this was cool; the other half thought this was wrong.
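(For the curious, here is a minimal sketch of how a web page can support that kind of drag-and-drop between containers, using the browser's standard drag-and-drop events. The `.calendar-event` and `.calendar-day` class names are made up for the example, and this is not how Google Calendar itself is implemented.)

```typescript
// Minimal sketch: dragging an "event" element between two "day" containers.
// Illustrative only -- not Google Calendar's actual implementation.

function makeDraggable(eventEl: HTMLElement): void {
  eventEl.draggable = true;
  eventEl.addEventListener("dragstart", (e: DragEvent) => {
    // Remember which element is being dragged.
    e.dataTransfer?.setData("text/plain", eventEl.id);
  });
}

function makeDropTarget(dayEl: HTMLElement): void {
  // Allow drops by cancelling the default dragover behaviour.
  dayEl.addEventListener("dragover", (e: DragEvent) => e.preventDefault());
  dayEl.addEventListener("drop", (e: DragEvent) => {
    e.preventDefault();
    const id = e.dataTransfer?.getData("text/plain");
    const eventEl = id ? document.getElementById(id) : null;
    if (eventEl) {
      dayEl.appendChild(eventEl); // move the event to the new day
      // A real app would also persist the change to the server here.
    }
  });
}

document.querySelectorAll<HTMLElement>(".calendar-event").forEach(makeDraggable);
document.querySelectorAll<HTMLElement>(".calendar-day").forEach(makeDropTarget);
```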

But perhaps this was the same reaction some had when they first dragged a Google map. What criteria are there to distinguish the uncanny valley from progress?

I agree with your points that desktop apps should adhere to OS conventions, but I cannot define what is and is not proper web app behavior. And if I can’t define behavior limits, then I can only conclude this is progress. Despite my initial gut reaction.

I agree that a web user interface that looks and feels like it should be on a desktop is unappealing to me. Web applications that look like they belong on a desktop don’t reach my expectations because any difference in how they interact with me is perceived as a weakness rather than a mere difference, such as the way web applications must sometimes refresh in order to show updated data on a page. Although the web application may be more robust and user friendly, little things like this turn me off. On the other hand, if it’s designed to look like a web application, I will tend to understand its behavior.

This is also discussed a lot in video games, not only with characters but with scenery too. For a while now people have been discussing how realistic video games have become less vibrant in color and more grey.

Also, in video games characters can look so real that they feel odd and unconvincing. I feel that a lot while playing Fallout 3; I can’t take the characters seriously. In Elder Scrolls 4 Oblivion (which uses the same engine but is an older game) I find the characters a lot more convincing, because they were not made to really look like a normal human.

This is particularly noticeable with cross-platform UI libs like wxWindows, GTK, Qt and SWT; they are often ALMOST right, and the departures from native behaviour become very irritating. See Eclipse on a Mac for a good example.

For a software developer - yes.
For an average user - no. An average user simply doesn’t care if it’s real, doesn’t search for differences, and for him/her it would be best if the web application and the desktop application behaved the same, because that would be easier to get used to.

For a developer, software is close and personal, like another person is for everyone else. That’s why it feels unnatural for him/her to use web apps that try to mimic desktop apps’ behaviour. But for an average user, web and desktop apps are nothing but utilities, things that have to be used in order to achieve a goal. They don’t care; it’s nothing personal for them. (Sorry for breaking your heart, you just have to realise this and move on with your life. :))

What are you guys talking about! The purpose and intent of Ajax and RIA technologies is to give web UI designers the ability to do things that are closer to desktop application operations than traditional web applications allow.

That’s because traditional stateless web application user interfaces sucked. What are web application expectations anyway? Type stuff… submit (postback)… wait… view result. I say rock on, web UI designers! Give me drag-drop, give me background updating panels (event-driven updates). I am still waiting for some of those other crusty old desktop features like great undo/redo functionality and the ability to paste an image directly into an email body, but as they keep working on the technology I am sure it’s not far off.

For users, web apps excel because of collaboration and accessibility. Users dump Outlook for web-based email so they can read their mail from home, work, school, wherever. Certainly not because Outlook’s desktop user interface sucks. Users have had to trade off rich interaction for those benefits. Today, that trade-off isn’t anywhere near as bad as it once was. Today many web apps simply rock. And that’s because of the energy and effort by many folks to bring rich (or desktop-like) interactions to the web. So let’s dispense with the noise that this is a bad thing.

Hmm… I love iTunes on my PC. The functionality is so much better than the rest of the players. The defining triumph of Apple.

I felt the same way about the Linux desktops that came out a few years ago. They acted just enough like Windows to raise expectations, but they didn’t deliver. They’ve gotten much better now that they’re not trying to mimic Windows 95 and have moved on to doing their own thing.

I now use Linux desktops and have the same feelings about Windows …

Data in Star Trek avoided the uncanny valley by being played by an actor … so he was far enough beyond the valley to not cause the uneasy feeling …

Perhaps the concept explains World of Warcraft’s relative success over more graphically, err, intense games. Cartoonish characters and settings allow us to think of it as a game.

Interesting idea, thanks!

James @ 7:40 has hit it right on the nose.

The uncanny valley metaphor is a good one when talking about virtual representations of people. Everyone has two eyes, a nose, and a mouth; everyone has certain facial tics and expressions and so forth. Each and every one of our brains is wired to look for these characteristics to help determine whether someone is human or not.

The reason the uncanny valley metaphor doesn’t apply to UI design is that not everyone has that same built-in mental model of how things should be, because not everyone is hard-wired to use the same features of the same set of apps in exactly the same way. To stick with the email client example, I haven’t used a desktop email app since 1999. As far as I’m concerned, Outlook and Thunderbird et al. are broken if they don’t behave the way Yahoo Mail or Gmail does, since web clients are what I’ve based my mental model on.

Mmm, Dead Rising, that game was fun :)

@Shane - There is a difference between the behavior that Ajax allows (updating without a refresh) and the visual cues of desktop-like chrome/widgets. Split panes, trees, data grids, etc. all set certain expectations.

Look closely at gmail. Notice that it is a blend between desktop capability and a webby interface. It creates its own look and feel that offers rich functionality while not looking out of place.
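To make that distinction concrete, here is a minimal sketch of the behavioural side of it: a full page refresh versus an Ajax-style partial update. The `/api/inbox` endpoint and `inbox-list` element are hypothetical, and this is not meant to reflect how Gmail actually works.

```typescript
// Traditional web app: re-request the whole page to see new mail.
function refreshWholePage(): void {
  window.location.reload();
}

// Ajax-style app: fetch just the data and update one region in place.
async function refreshInbox(): Promise<void> {
  const response = await fetch("/api/inbox"); // hypothetical endpoint
  const messages: { subject: string }[] = await response.json();
  const list = document.getElementById("inbox-list"); // hypothetical element
  if (list) {
    list.innerHTML = messages.map((m) => `<li>${m.subject}</li>`).join("");
  }
}

// Poll for new messages in the background without disturbing the rest of the UI.
setInterval(refreshInbox, 30000);
```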

Not so sure I agree with you, Jeff.

Although I found the example of the uncanny valley in robotics very interesting, I’m not sure how relevant it is to UI. I think it’s important to remember that desktop and web apps are themselves perhaps only a decade old.

Having worked with hospital staff, I’ve seen how they reacted to new spreadsheets designed to mimic the paper versions of documents they were used to working with. A similar ‘uncanny effect’ was very evident when the nurses focused on every difference between the new electronic systems and the paper forms they were used to.

However, with some hindsight, the reaction of the nurses is simply the same reaction 90% of humans have when presented with change. We tend to focus on the negative.

My personal thought on Zimbra’s tactic of mimicking Outlook is that they have chosen a solid base, one people are very familiar with, and that they can now build from this. Any complaints people may have regarding slight differences are insignificant (in the sense that they can address any issues raised); what is significant is that a lot of people have adopted the app because it feels familiar.

I think we are in the middle of a paradigm shift. Not a revolutionary one, but a significant one. This means change, and any change is certain to be accompanied by its fair share of baby ducks (http://en.wikipedia.org/wiki/Baby_Duck_Syndrome).

When it comes to UI design, I think the uncanny valley often appears from improper use of accepted conventions. In The Design of Everyday Things, Donald Norman points out that there is no rational reason why the hot water faucet is on our left and the cold on the right, but because that has evolved to be the standard convention (at least in the States), any departure from that scheme becomes problematic. Agreeing with what others have said about how humans examine faces for subtle clues, I think we also examine a UI for subtle clues within the context of its environment, Windows vs. Mac being obviously different environments.

Part of the problem is that we have people who ignore the standard convention, place the hot water faucet above the cold, and call it progress. Part of the problem is that we should have developed a whole new and different environment for web apps and we didn’t. We merely attempted to copy the existing paradigm into the new environment. We enshrine the convention when we shouldn’t and throw it out when we should keep it.

It’s kind of like when a client asks me to draw check boxes on a computer-generated form. They are simply dragging the old accepted convention forward and I’m going WTF! Do you know why there were check boxes on your old hand-printed form in the first place?

Nothing more annoying than a web application that breaks the back button.
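Part of why that happens: if an app swaps content in with Ajax but never changes the URL, the browser’s history has nothing to step back to. A minimal sketch of one common fix, using the HTML5 History API (which arrived after this discussion), might look like the following; the view names and the `showView` helper are made up for illustration.

```typescript
// Minimal sketch: keeping the back button meaningful in an Ajax-driven app.

function showView(name: string): void {
  // Stand-in for swapping the requested view in via Ajax.
  const main = document.getElementById("main");
  if (main) main.textContent = `Now showing: ${name}`;
}

function navigateTo(name: string): void {
  showView(name);
  // Record the navigation so the back/forward buttons have something to work with.
  history.pushState({ view: name }, "", `#${name}`);
}

// When the user presses back/forward, restore the corresponding view
// instead of dumping them out of the app entirely.
window.addEventListener("popstate", (e: PopStateEvent) => {
  const state = e.state as { view?: string } | null;
  showView(state?.view ?? "inbox"); // "inbox" is an arbitrary default view
});
```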

I think this isn’t exactly about the difference between desktop apps and web apps. We don’t think all desktop apps are cuddly and all web apps creep us out, right?

Alan Cooper observed that even in desktop apps, if you change the responsiveness, even to speed it up, people complain the new version is slow. People have a rhythm to their use of software, and if the rhythm is changed, they are put off. Even with AJAX and Flex and other RIA stuff, a web app will never have the same rhythm as a desktop app.

Is our instinctive sense of rhythm the analog of our natural face-recognition senses? Does our sense of the rhythm of a software UI relate to our ability to sense music?

Reminds me of that horrible app, ‘Outlook Web Access’ for Exchange servers. Eek.

iTunes for Mac is another great example.

Also, that’s probably why Google Docs didn’t quite catch on… Although they do have very good web-based features, their product still looks too much like a desktop application to be user friendly.

I work on Windows applications and we also have some online equivalents. Except for the menu bar at the top of IE, they look almost exactly alike. We do have the occasional user who gets confused about the difference between the two. Working online generally means you are diving into a database and not a physical file. They are flabbergasted to find out that File - Open doesn’t really open the file on their desktop but the equivalent stored in the belly of the server.

While most users enjoy the collaborative nature of working with the online version, the occasional users hate the slower responsiveness (because opening a 10K-row datatable over Ethernet is never going to be like the desktop equivalent), enhanced security, and different layout. But I think this product fits perfectly into the online world, possibly even better as an online application than a Windows application.