To Serve Man, with Software

I didn't choose to be a programmer. Somehow, it seemed, the computers chose me. For a long time, that was fine, that was enough; that was all I needed. But along the way I never felt that being a programmer was this unambiguously great-for-everyone career field with zero downsides. There are absolutely occupational hazards of being a programmer, and one of my favorite programming quotes is an allusion to one of them:


Jeff, this echoes something I’ve said many times before: the world doesn’t need more programmers, it needs better programmers.

I didn’t originally intend for the framework I built to be a teaching tool. Over the past several years, though, I’ve seen people come into our chat room with the same major gaps in their understanding of modern software design principles and tools. These aren’t newbies to the profession, either; many of them have been working as web developers for years, yet they don’t understand something like SOLID design, why they should use a package manager, or the anatomy of an HTTP request.

How does this happen? Is it the fact that programming is such an accessible profession, requiring only a computer and an internet connection? Unlike other technical vocations, it’s possible to get started without years of schooling or a job that provides access to tools and equipment. Combine that with the landscape of code snippets floating around (not just Stack Overflow, though it is a major culprit these days), and many people can see tangible results without ever needing to really learn anything. Doesn’t this reflect badly on our profession as a whole? Most people who aren’t programmers can’t tell a good programmer from a bad one.

I would agree with you that Stack Overflow has certainly helped me become a better developer. But that’s because I’ve chosen to think critically, answer a lot of questions, and take the time to learn more about the assumptions and technologies that underlie a given question. What about the developers who just show up because “gimme teh codez”? Are we just enabling bad habits and helping to proliferate bad code?

What about when someone’s bad code becomes another developer’s problem? The DestroyBaghdad method is a pretty clear-cut case of code doing bad things. But what about more subtle cases of “software development malpractice?” We’ve all inherited that giant ball of spaghetti code before, the one that leaves us with the impossible choice of working with it to implement new features, or forgoing adding new features to rewrite it (and infuriating the business folks). Should the original author of the project take responsibility for the mess they left?

It seems to me that with the proliferation of information has come a commensurate proliferation of disinformation. SO and web tutorials are littered with SQL injection bugs, and new developers are learning from them. I’m doing my part to educate others, but how can we tackle this in a scalable way?
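To make the point concrete, here is a minimal sketch of the kind of SQL injection bug the comment above describes, using Python’s standard sqlite3 module. The table, data, and function names are hypothetical; the vulnerable pattern is the one many tutorials still teach.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # Vulnerable: user input is spliced directly into the SQL string.
    # A name like "x' OR '1'='1" rewrites the query and returns every row.
    query = "SELECT * FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Safe: a parameterized query; the driver treats the value as data,
    # never as SQL, so the same payload matches nothing.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # leaks all rows: [('alice', 'admin')]
print(find_user_safe(payload))    # returns []
```

The fix costs nothing to type, which is what makes the prevalence of the broken version in tutorials so frustrating.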


Great end-of-year post, and I will surely watch that series, but I would also point to the 1973 book Small Is Beautiful: A Study of Economics As If People Mattered, which not only predicts bad outcomes of technology but also supports this with back-of-the-envelope calculations. It is on lists of the 100 most influential books; there is info on Wikipedia and Google Books. Now, so many years later, we see how accurate the book actually is: technology worsens the gap between the poor and the rich, and does little to end famine. Happy 2018!


Fair warning: please DO NOT start with season 1 episode 1 of Black Mirror!

I started watching with my wife after hearing about Black Mirror and she refuses to watch any other episodes. I can’t say I blame her.


Well, software plus screens and devices can be seen as a thought-sharing medium, like paper, except that it can compute things and change what it displays and does. It can only display and do what the people on the other side of the wire put into it. That means programmers, the people who put thoughts into computers, should definitely grow themselves to have a better view of the world and of life. Then computers and software will serve the rest of the people in need in much better ways.

To be fully human, it would seem natural that we help each other. It’s good to see good people doing good works.


Can’t say I agree with Jeff here; the first two seasons are the only ones that had something to say before the series devolved into a “big twist” focused, tech-inflected Twilight Zone, IMO. The third and fourth episodes in particular are key episodes. The first episode might not be for American sensibilities, but it is also a really valuable commentary on the unfathomable power of social media.


A bit relevant: there are quite a number of free games with very low hardware requirements that teach programming. I personally played RoboZZle, which is now available for every major mobile and desktop OS from various vendors:

DestroyBaghdad is fine in certain contexts:


As a standalone procedure you can tie an event to it without having to worry about providing a parameter.

Also, over the years I’ve written plenty of wrapper functions that simply call an underlying function with common parameters.
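The two points above can be sketched together. The following is a hypothetical illustration (the function names come from the quote under discussion, not from any real codebase): a generic, parameterized function does the work, and a zero-argument wrapper bakes in the common parameter so it can be handed directly to an event system that expects a no-argument callback.

```python
def destroy_city(city):
    # The generic, parameterized version the Borenstein quote argues for.
    return "%s destroyed" % city

def destroy_baghdad():
    # Wrapper with the common parameter baked in; anything that expects
    # a no-argument callback (an event handler, a button binding) can
    # call this without caring about the parameter.
    return destroy_city("Baghdad")

print(destroy_baghdad())  # Baghdad destroyed
```

In Python specifically, `functools.partial(destroy_city, "Baghdad")` achieves the same binding without defining a named wrapper, which is often the tidier choice when the wrapper adds nothing but the argument.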

Also, what if you’re writing a video game in which only Baghdad could possibly be destroyed?

That doesn’t mean the game won’t change or have new features (i.e., destroyable cities) added in the future, long after you’ve written that code; do you want to have to do extra work when the time comes to extend that function?

Suppose the game is set in Baghdad. You’re trying to stop the bad guys; they are trying to nuke the city. Fail, and the nuke goes off. (Think of something like the original XCOM: you’re trying to beat down the bad guys’ organization, while they’re trying to build it up enough to bring the bomb in.)

Suppose a DLC is released for the game that adds a new city to save from the bad guys with nukes (say… London), now what?

I wouldn’t expect a DLC to add a whole new world to a one-world game.

I’d ask why, but it’s irrelevant and you’re missing the point.

The discussion thread reminds me of: “Has anybody noticed that building before?”


This makes sense, and good call on the Black Mirror scenarios. It’s interesting how software engineers are their own worst enemies. I’m challenging myself to learn more about software development, and about my own flaws, to become a better developer. I always have the issue of jumping into something trendy, which keeps me from focusing on what’s at hand; I want to be more than a coder and build better solutions. Great piece. Your blog is important to developers.

Fair warning: please DO NOT start with season 1 episode 1 of Black Mirror!

What’s wrong with season 1?

Episode 1 has a bit of a :pig: issue. Lots of people can’t get past that episode and I don’t blame them. Start with season 3! Which also has at least two of the best episodes in the series.

I’m new here. What does it mean to have a :pig: issue?

I had the exact same experience. In light of Jeff’s advice, I might try to get her to give it another go.

Funny enough, the same thing happened with Star Trek TNG. Hadn’t seen it since I was a kid, wanted to show it to my kids, and the first few episodes were junk. Just like this, I saw an article online shortly thereafter that said to skip ahead to a particular later episode (somewhere in the middle of S2 I think?) and just pretend the earlier ones never happened.