It can be argued that things like spelling are completely arbitrary: spelling is not about intelligence, only memory. In English, very few spelling rules apply globally; there are almost always exceptions, and that's before even considering localization and slang. Surely the computer is freeing our brains for more important things.
The problem is not only learning but also continuity. Nowadays, not many people have the skills and resources to create a product and then start mass-producing it. Think, for example, of the corporations that produce laptops. Those corporations may have thousands of employees and many factories, with lots of subcontractors and their workforces involved. Sure, you can help children learn. But the truth is that a workforce is still needed, and continuity. What I mean by continuity is that not much can be done easily from scratch. You need to be part of a bigger system that develops over time.
Also, there is no point in producing, e.g., new supercars if people are not buying them. People might have reasons for not buying: the supercars are too expensive, people want greener cars, people prefer taking trains, etc. Whatever the reasons, in the end, everything is connected, now and over time.
If you need it spelled out to you why PHP is not a good language, perhaps you’d best stick with it. Yeah, I’m looking at you.
No doubt Alan Kay is one of the ‘hall of fame’ computer scientists, but the granddaddy of them all, imho, is Douglas Engelbart. Most of the PARC stuff derived from Engelbart’s work.
Btw, Alan has an interesting comment on users of Guitar Hero; however, Guitar Hero has spawned a huge interest in people wanting to learn real guitar. Many of my guitar-teacher buddies are overwhelmed by the number of students they now have, mostly kids who got the guitar itch from the video game.
Sounds like a great guy… he certainly has a lot of great achievements that have benefited many of us. I’m curious, though, how exactly someone can invent 3D computer graphics or the GUI. Both are implicitly required by the nature of the universe; e.g., a GUI existed when the first computer was switched on. It had to produce output to be useful, and that’s a crude GUI. As for 3D graphics, the theory behind the implementation has been around for over 2000 years, while implementing it isn’t exactly rocket science either (a rotating wireframe cube was one of my earliest BASIC programs, and I intuited the whole thing with less than a high school education [yes, even the perspective transformation; no, I didn’t use rotation matrices]).
I’m not trying to say I’m special, just that this stuff is stupidly easy compared to some of the other things mentioned, e.g. implementing an OO language (I assume that’s what is meant, as the basic idea itself is, again, trivial).
I would agree that most of our software is not that great, but I think the real impediment is that the average user does not code… my ability to code increases my productivity in my regular desk job by a lot…
Throughout the years I’ve really really enjoyed your posts but… you are almost losing a reader here
aww… suck it up kitten.
I guess lots of people are getting the army itch from WW2 games and such…
@Jeff: You really should stop blaming the tools, which include programming languages, for horrid code and shoddy application design.
PHP didn’t write that code. PHP didn’t design those applications. PHP does not act on its own. PHP is not a person.
People are the only ones to blame for bad code and people will always find a way to write bad code in any language.
Jeff, I enjoy your blog. I have been a professional developer for less than 5 years, and am still (and hopefully always will be) hungry to increase my skill and understanding. Many of your posts have been thought-provoking and informative. I know you sometimes get a lot of flak, but I appreciate what you have done. This was a good post. Thank you.
That’s why the Army has built its own war video game.
On the topic of programming languages…
One of the main threads in Alan Kay’s work is that computers should be thoroughly user-programmable. We have really moved away from that in the past few decades, though some aspects still remain, or until recently remained, user-programmable (shell languages, AppleScript and HyperCard [r.i.p.], macro languages in apps like Office, embedded scripting languages in some games, lots of stuff on the web).
I’d love to see a ground-up redesign of the PC that simplifies the OS to a large degree and lets users directly manipulate how the software works, through easy-to-learn programming/scripting languages, something like visual programming, or a combination of these things.
(The XO/OLPC is in many ways this thing; no surprise it’s built around Alan Kay’s Squeak environment.)
Have you read Dealers of Lightning? It’s a history of Xerox PARC. I think you’d enjoy it.
We are without doubt standing on the shoulders of giants. Those who compile today might well wonder if they were also capable of writing a usable compiler. If they pass that point, they may also wonder if they were capable of inventing the idea of compilation. I suspect the honest answers are as you would expect.
Mike Swaine published a wonderful interview with the late Bob Bemer in DDJ about eleven years ago. Bemer invented ASCII, and maybe typecasting, and probably a bunch of IBM mainframe-specific stuff.
Where would you be without ASCII, or typecasting?
A technical biography of Bemer, Hopper, Brooks, and the other IBM’ers of that era would be most welcome.
Thanks for another great post.
Thanks also for turning me on to Jeff Moser’s blog – yet another repository of fine reading material to fill my busy workday
99.9999999% of the time this is from Windows XP users who have ClearType disabled
By gum, you’re right. I don’t recall disabling it though… ever.
I learned something today.
I learned how a cheap-shot attack on a dead-horse issue like PHP can ruin an otherwise enjoyable post. But it really doesn’t matter.
Regarding computer usefulness versus entertainment: I believe the inventors of television had a similar complaint. The TV was envisioned as a great tool for long-distance learning but devolved into mainly an entertainment box. In many ways the computer has undergone a similar de-evolution.