Jeff, this echoes something I’ve said many times before: the world doesn’t need more programmers, it needs better programmers.
I didn’t originally intend for the framework I built to be a teaching tool. Over the past several years, though, I’ve seen people come into our chat room with the same major gaps in their understanding of modern software design principles and tools. These aren’t newbies to the profession, either - many of them have been working as web developers for a number of years, yet they don’t understand something like SOLID design, why they should use a package manager, or the anatomy of an HTTP request.
How does this happen? Is it the fact that programming is such an accessible profession, requiring only a computer and an internet connection? Unlike other technical vocations, it’s possible to get started without years of schooling and a job that provides access to tools and equipment. Combine that with a landscape of code snippets floating around (not just on Stack Overflow, though it’s a major culprit these days), and many people can see tangible results without ever needing to really learn anything. Doesn’t this reflect badly on our profession as a whole? Most people who aren’t programmers can’t tell a good programmer from a bad one.
I would agree with you that Stack Overflow has certainly helped me become a better developer. But that’s because I’ve chosen to think critically, answer a lot of questions, and take the time to learn more about the assumptions and technologies that underlie a given question. What about the developers who just show up because “gimme teh codez”? Are we just enabling bad habits and helping to proliferate bad code?
What about when someone’s bad code becomes another developer’s problem? The DestroyBaghdad method is a pretty clear-cut case of code doing bad things. But what about more subtle cases of “software development malpractice”? We’ve all inherited that giant ball of spaghetti code before, the one that leaves us with the impossible choice of working with it to implement new features, or forgoing new features to rewrite it (and infuriating the business folks). Should the original author of the project take responsibility for the mess they left?
It seems to me that with the proliferation of information has come a commensurate proliferation of disinformation. Stack Overflow and web tutorials are littered with SQL injection bugs, and new developers are learning from these. I’m doing my part to educate others, but how can we tackle this in a scalable way?
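To make the SQL injection point concrete, here’s a minimal sketch (using Python’s sqlite3 module and a made-up `users` table, purely for illustration) of the string-concatenation pattern many tutorials still teach, next to the parameterized fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

# The tutorial pattern: building SQL by string concatenation.
# A crafted input becomes part of the executable SQL.
user_input = "' OR '1'='1"
unsafe_query = "SELECT * FROM users WHERE name = '" + user_input + "'"
unsafe_rows = conn.execute(unsafe_query).fetchall()
print(unsafe_rows)  # the payload matches every row in the table

# The fix: a parameterized query. The driver treats the input as
# data, never as SQL, so the same payload matches nothing.
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe_rows)  # []
```

The copy-pasteable “gimme teh codez” version is the top one, which is exactly the problem: it works fine in a demo and only fails once a hostile input shows up.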