Of all the professional hubris I've observed in software developers, perhaps the greatest sin of all is that we consider ourselves typical users. We use the computer obsessively, we know a lot about how it works, we even give advice to friends and relatives. We are experts. Who could possibly design software better than us superusers? What most developers don't realize is how freakishly outside the norm we are. We're not even remotely average: we are the edge conditions. I've often told program managers: if you are letting me design your software, your project is in trouble.
This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2004/09/the-rise-and-fall-of-homo-logicus.html
Indeed - who was it who first observed that making things complex is a simple task, while making things simple is complex?
“Indeed - who was it who first observed that making things complex is a simple task, while making things simple is complex?”
I’m thinking of UNIX and LINUX in particular. How many years have they had to develop a decent UI? 20? 25? And counting. It’s a hard problem, and developers aren’t good at it. Linux is the hardest of the hard core developers and they arguably suck the most at it. Which is kind of what I was saying.
“We are not one-dimensional automatons.”
Well, like most generalizations, it exists because it’s true most of the time. Many developers are terrible at UI design, largely because they can’t relate to “normal” users. Is it possible for developers to transcend this limitation and moonlight as UI designers? Sure. But bear in mind this is additional work above and beyond traditional software engineering skills, which is quite complicated already. It’s a very full plate, which is why I like to see a division of labor. Developers of course should have input, but only those developers who have the discipline to realize that they are NOT representative users!
It is something I wish more developers would study and cultivate (buy them a copy of Krug’s Don’t Make Me Think!), because it’s a huge deficiency shared by almost all the developers I’ve known.
“What’s this, a return to the waterfall, except that we’re doing ‘interaction design’ instead of requirements? That’s proven not to work.”
It’s a good point, and I’ve seen it fail in exactly this way. You have to do mini-cycles, and overlap with UI design quite a bit. There’s usually lots of up-front project work to do anyway: researching third party tools, writing proof of concept code for tricky parts, back end design, etc.
I think that you’re falling into the same trap that Alan Cooper has been for years. While programmers clearly interact with software on a different footing than other users, it’s well within the realm of the possible for programmers to get outside their own heads, listen, and think like end users. We are not one-dimensional automatons.
I don’t like what Alan Cooper proposes, which is to do ALL interaction design before beginning coding. What’s this, a return to the waterfall, except that we’re doing “interaction design” instead of requirements? That’s proven not to work.
Are the points he raises good? Yes. Is his proposed solution comical in its naivety? Yes again.
Though I get paid to do some programming work, I usually build a UI first in all projects I do. I make the UI work and function for what is needed, and then I may go back and rework the code to be somewhat more efficient.
I know this is a backwards approach and it’s not the best method simply because I have numerous old projects that only have a UI shell and no actual value as a program. Sure it looks pretty but that’s about it. Then again I’ve always made little helper apps that would get some of my work done quicker, I never design full blown ERP packages or CRM suites (though I’ve been tempted to recently).
I can say that it’s difficult to think like an end user, and even then a developer typically cannot find all of the problems a typical end user will.
A solution to this would be to get friends and family to be your beta testers. I may be the only one who is basically employed as my family and friends’ personal PC repair shop, so I figured it was time some of them paid me back. How can they do that? Beta testing. Since they’ve already proven they can break their OWN PC, I can give them an application and let them do what they do best. It’s the least they could do, since they never offer anything in return and expect me to cater to their every computer-related whim.
I remember the last job interview that I had - after the interview (they were sure they were going to hire me) they said something along the lines of
you shouldn’t undercut yourself like that. You might not know that most users have no idea what Linux is, and even the better half of them think it’s the company that made the router they have at home for their son’s Xbox.
How were you undercutting yourself?
It’s well within the realm of the possible for programmers to get outside their own heads, listen, and think like end users.
Problem is, they normally end up thinking like they think users think; or worse, thinking like they think users should think.
- Type A → The user is such a moron that he needs a cute little paperclip to show him what to do.
- Type B → The user should want to learn at least 5 new commands before opening a document and typing it.
This is me, all the way. I look into X10, decide that it isn’t flexible enough, decide to completely rewire the house to my own standards. The house that I don’t own yet, and probably never will.
However, I disagree that we are willing to sacrifice simplicity, we want more control -and- more simplicity.
UNIX is simple, not in the sense that the end-user might imagine, not that it’s simple to use, but that it’s simple to understand, and was simple to design and implement. It’s big, but it’s made of lots of tiny, simple parts.
Windows is big, and made of big parts. Big parts that don’t let you take them apart to see how they work.
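To make the “lots of tiny, simple parts” point concrete, here’s a minimal shell sketch (the classic word-frequency pipeline, not from the original comment): each tool does one small job, and you can inspect the output at any stage by cutting the pipe short.

```shell
# Find the most common word in some input by composing small,
# single-purpose tools: sort groups duplicates, uniq -c counts them,
# sort -rn ranks by count, head takes the top entry.
printf 'apple\nbanana\napple\ncherry\napple\nbanana\n' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -n 1
```

Remove the trailing stages one at a time and rerun it to see exactly what each part contributes; that inspectability is the property the comment is describing.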
We don’t desire control, we desire things to be inspectable. We can’t imagine using things without knowing how they work. We can’t believe that people drive cars/use microwaves/browse the internet/etc, without fully understanding what’s going on. How are you supposed to fix it when something goes wrong?!?
Knowledge is power, it’s freedom, it’s fun!
Yes, I realise I contradict myself a bit there.
Where the heck does the “acts like a jock” bit come from? That doesn’t mesh with any geek/engineer/programmer stereotypes at all.
“I’m thinking of UNIX and LINUX in particular. How many years have they had to develop a decent UI? 20? 25? And counting. It’s a hard problem, and developers aren’t good at it. Linux is the hardest of the hard core developers and they arguably suck the most at it. Which is kind of what I was saying.”
Since you don’t mention KDE or Gnome, you must either not know about them, or not consider them decent. If you don’t know about them, then don’t talk about the state of Unix UI until you have looked at them. If you don’t think they are decent, then please explain why, and whether you think Windows and Mac OSX qualify as decent.
congratulations jeff, you’ve provoked the billionth pointless argument about user interfaces on the internet