All Programming is Web Programming

“Should all applications be web applications? Of course not. There will continue to be important exceptions and classes of software that have nothing to do with the web. But these are minority and specialty applications. Important niches, to be sure, but niches nonetheless.”

I wouldn’t say non-web code is the minority by any means. Just looking around my office I see an iPhone (20 million?), an iPod (hundreds of millions), a Cisco IP Phone handset (millions?), a UPS, and a digital watch. All of these have embedded code - some of it simple, and some of it many times more sophisticated than your average web application.

While programming may be done on a computer, it has a much larger market than things that are used on a computer or things that run in a browser.

These past articles are causing me to wonder what small corner of the world Jeff lives in. I have seen and worked on many “web” applications that are nothing more than “desktop” apps that can be viewed over an HTTP request.

I think I keep coming back here for the same reason I can’t seem to not stare at car accidents.

This blog is a wreck.

I’m not a genius, but I am pretty sure that there are just as many terrible desktop applications as there are web apps! Look at gaming: games like Line Rider, which are Flash-based web applications, are FREE and more enjoyable (and played by a lot more people) than, say, Daikatana, which was a game designed for a desktop PC. Daikatana was made by some of the “smartest” people in game development. Sometimes a good idea, even as a poorly implemented web application, will be better and more successful than a shrink-wrapped and expensive desktop app!

I fear to see Emacs as a web app. A JavaScript ELisp interpreter … scary…

Gah - no one will ever read a comment this low - (I didn’t even make it) but…

Mike’s argument is entirely valid (ignoring the bit about dumb software developers). The way things are now, “serious” programmers have a huge wall to climb if they want to be web developers. The internet is an INSANE environment for people who want to do serious work and not get drowned in the tedium of cross-browser development, UI design, and nit-picky browser speedups.

Things are significantly better these days than they were a couple of years ago. New practices in JavaScript (at least for me) have made things significantly more sane. jQuery is a great step in the right direction, but the DOM is a mess. (Incidentally, I think that John Resig is becoming a personal hero… I haven’t bought any books or anything, but he seems to be a driving force for making things better.)
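
To make that concrete, here’s a rough sketch (the markup and class names are made up, not from any real app) of the same trivial task written against the raw DOM and then with jQuery:

    // Sketch only: highlight every table row that has the "overdue" class.

    // Raw DOM: verbose, and historically full of cross-browser traps.
    var rows = document.getElementsByTagName('tr');
    for (var i = 0; i < rows.length; i++) {
      if ((' ' + rows[i].className + ' ').indexOf(' overdue ') !== -1) {
        rows[i].style.backgroundColor = '#fee';
      }
    }

    // jQuery: one chained call, with the browser differences papered over.
    $('tr.overdue').css('background-color', '#fee');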

The fact that Mike is probably missing is that in five or ten years when he’s finally forced to make the jump for one reason or another, the web should be a PLEASURE to program for. Things are clearly moving in that direction. We’ll have the good development tools that he’s pining for (a lot of them are already here in bits and pieces).

In a few years this debate MUST be moot; the differences between web programming and application programming need to be fixed - precisely because application development is better for all the wrong reasons.

I’m assuming this website isn’t hosted and served by a web app.

I don’t know about you, but features and performance are two things I appreciate in most applications. I don’t know if the two can possibly coexist in web applications, but for the most part neither is present anyway.

The author’s comments in his blog post use terms that can be a bit confusing. I think it’s fair to say that what he means is that he doesn’t want to work with browser-based apps. The author is obviously ranting about the developer experience, not the “web app” platform.

The author is right in that a developer has to sacrifice so much of the developer experience in order to deliver a rich user experience in a browser. JavaScript is an expressive language, but it is also closely coupled to the browser. This makes unit testing a nightmare for browser-based apps that implement complex business rules. Not to mention it makes a lot of stakeholders pretty damn angry, because they don’t understand that you are basically developing in Notepad.
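
One way to keep it testable, as a hedged sketch (the function and field names here are hypothetical, not from any particular app): put the business rule in a plain function with no DOM references, and keep the browser glue separate.

    // Hypothetical rule: no document, no window, nothing browser-specific.
    function calculateDiscount(order) {
      if (order.total > 1000) { return order.total / 10; }      // 10% off large orders
      if (order.items.length > 10) { return order.total / 20; } // 5% off large carts
      return 0;
    }

    // Thin browser-facing glue, kept out of the rule itself.
    function renderDiscount(order) {
      document.getElementById('discount').innerHTML =
        '$' + calculateDiscount(order).toFixed(2);
    }

    // The pure function can be checked with a one-line assertion,
    // no browser required.
    console.assert(calculateDiscount({ total: 1200, items: [] }) === 120);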

The amazing work that Google is doing in the browser is in Java, not JavaScript. Google went through the headache of developing GWT because JavaScript becomes the ultimate, expensive maintenance nightmare in LARGE applications.

In terms of the devs that work on “web apps”, many start out as designers rather than programmers. They are very comfortable using a language closely coupled to the browser, with no type checking and no code navigation. Programmers want to be able to look at an object’s structure in seconds, and go to the definition of a function with a single keystroke. That’s why many programmers don’t like JavaScript.

Is everything going to the “web”? Absolutely, but hand-written JavaScript + HTML not only provides less of a rich UI experience than desktop-based RIAs; for large apps it’s also more expensive, more error-prone, and more limited in what you can do with it.

@Brian Carson

I think you are responding to the article he is quoting. I believe Jeff would agree with you.

Jeff, you are blinded by your own experience. Not all programming will be web programming. There is no way that vision systems on trains measuring rail profiles in real time for defects, real-time trading apps, or other control systems will be “web apps”. You’re off your rocker.

The popular, super-hyped “social” apps will of course be web apps. I suppose all email and maybe desktop publishing will migrate. But, as the over-hyped, under-delivering web appliances and “set-top boxes” showed, we’re still a long way from world dominance by the web.

Frankly, having everything controlled by the interwebs is scary to me. There is so much more bad programming in that paradigm than in other, more structured fields. Being web-based is, at the same time, more robust yet also open to many more vulnerabilities.

Stick to what you know - both you and the poster you are responding to are just slinging arrows and rehashing the “VB vs. C” coding argument.

Wow, I don’t think I’ve ever seen so much chest puffing and dick waving in my life. When did programmers become so freaking insecure?

@D

Maybe it’s the economy.

Anybody who has used a web app on an iPhone knows that a web app is no substitute for a “real app.”

In fact, I have yet to see a web app in a desktop browser that comes close to being as nice as a native app. Web apps may be acceptable, but they all suck, and will continue to suck.

I don’t think non-web apps are on the way out. However, I do think desktop apps need to evolve.

I would like to see desktop apps using the good ideas of web apps. The App Store is a good example of this. The app upgrade process is less painful than a standard desktop upgrade (although a small amount of user interaction is currently still required). Linux software management tools (apt, yum, etc.) are also a step in the right direction.

Multi-touch on the iPhone is an example of something that web apps can’t (and may never?) embrace.

Another reason why I don’t think desktop apps are doomed is business. I can’t imagine there are many businesses that would trust ‘the cloud’ with all their precious data.

There is a massive failure of education.

I am the web developer Michael Braude hates – or at least I was. I had no formal background in CS. I morphed from being a designer to a coder in the ugliest possible way. You all know the story: cutting and pasting JS, etc. Worse: I thought I was good at it and became a teacher, explaining to other people how to write spaghetti code.

The thing is, I had no way to know I was engineering things badly. Naturally, my crappy code came back to bite me in the end, but I just figured that happened to all programmers.

I didn’t know what books I should read, what subjects I should learn, or how to improve. Having read “JavaScript for Dummies,” I figured I was an expert.

Luckily for me, I’m Mr. Self Improvement, so I stumbled around until I did find resources that could help me, and after years of reading books like “Code Complete” and “Refactoring,” and after being mentored by some experienced developers (who were also good communicators) that I just happened to meet, I can actually write code that doesn’t make me (or my co-workers) shudder.

I’M THE NORM. Many, many developers are not going to go through formal channels. They’re just not. We need to get over that, accept it, and start working to better the situation given that it IS the situation.

As I see it, there are two major things new developers need help understanding:

  1. How to write good, clean code.
  2. Why it matters.

(Their managers also need help understanding #2.)

If beginners don’t understand these things, that’s not their fault. It’s the fault of an industry that thrusts newbies into the ocean before they know how to swim. It’s the fault of companies like Adobe who pitch a message of “anyone who knows how to use a word processor can be a web developer.” And mostly it’s the fault of senior developers for not effectively passing on their knowledge.

If there was EVER a field that needed a mentorship approach, it’s programming.

I’m as much at fault as anyone. The irony is that I used to teach back when all I could do was make things worse. Now that I actually (somewhat) know what I’m doing, I’m not teaching any more.

@D

About the time the walls around the magic kingdom came down, and pretty much any layperson with a passing curiosity about the workings of the mystic machines and arcane spells within was free to learn about them.

By the way, there are a lot of terrible programs written in C, C++, Java, etc.

It’s amazing how many of the comments here are criticizing Michael Braude’s article, often in quite personal terms. What’s the point? If you disagree with Braude, why not go to his blog, read the full article there, then explain to the author why you disagree with it? Attacking him here is just cowardly.

Even more amazing is how many people don’t seem to realize that Braude didn’t write the Coding Horror post, and that Jeff is actually disagreeing with it.

Not one whole day after broadcasting “PHP is the next COBOL,” we have this. Not so much fun being derided, eh, Jeff? Every social group has some sort of stratification; I think it looks something like this:

“Neck Beard” Level
Assembler, Embedded languages, C, C++, Languages missing GC (Look at me, I do pointers!)

“Professional” Level
Objective-C, Java, Python
Lisp, Haskell, Erlang
Any language the neck beards deem worthy.

“Noob” level
Ruby, ASP, PHP, JavaScript, ActionScript

I have experience with C, C++, Java, J2EE (what a mess), Haskell, and Python. After graduating with a degree in CS, I was attracted to PHP and JavaScript for: (1) the immediacy of seeing a result in your work; (2) untyped collections (array() in PHP, [] and {} in JS), which make everything less painful (no more UserAccountCollection classes); and (3) the job opportunities that would allow me to build interesting web applications.
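
To illustrate point 2 with a quick sketch (the data and names are invented): with plain literals, an ad-hoc grouping takes a couple of lines, instead of a hand-rolled collection class.

    // Plain arrays and object literals: no ceremony.
    var accounts = [
      { name: 'alice', admin: true },
      { name: 'bob',   admin: false }
    ];
    var admins = accounts.filter(function (a) { return a.admin; });

    // The alternative I was escaping: writing a UserAccountCollection class
    // (constructor, add, get, iterate...) before you can hold two records
    // in one place.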

Does this make me a bad programmer? Do I care? Should I? I’m doing what I enjoy, and I’m working with technologies that have quirks, but every piece of tech has some bugs or issues. No language is perfect, no programmer too good. Obviously we aren’t above squabbling about inconsequential nonsense, like language/platform preference, so none of us is perfect.

I’m not sure if he’s back-tracking or just making a clarification, but Michael states the following in a comment on his own article:

“I don’t consider server-side programming ‘web programming’ because SOA encompasses all mediums. Once you get into writing web services and DAL’s you’ve left the world of aggravation.”

That should be a bit less inflammatory, especially to those who would be reading this blog.