Continuous integration is rather new, and it's debatable whether it's necessary. For big projects you may shoot yourself in the foot without it, but for small projects it tends to be overkill.
I don’t think we should remove anything from the lists. Other than source control, I have seen every rule violated at one time or another in the last five years.
I agree. I am inclined to add this to the list. But I am wondering if, like continuous integration, this is too specific. Clearly unit testing alone isn’t enough; what about other quality practices, such as code reviews? Why not single them out as well? McConnell’s “do you have a quality assurance plan” might cover it.
“Other than source-control, I have seen every rule violated at one time or another in the last 5 years.”
Well, at least we’ve got source control going for us. Although it’s debatable whether Visual SourceSafe is a good or bad influence on developers.
Do you follow agile methodologies?
OK, but what does “agile methodologies” mean? Be specific.
what core set of practices constitutes modern software development…
Good communication skills.
I’ve met many developers who are good at coding and making things go, but when it comes to communicating they just fail. It doesn’t matter whether they’re communicating with customers, other developers, or bosses; they just don’t do it successfully, and that puts the hard work of others, the project, and themselves in jeopardy.
I mean e-mails, phone calls, person-to-person: they’re just not people persons, which is fine and all, but to me, being a developer means communicating clearly and confidently, so that the person you’re talking with understands everything they need to, and so that you understand everything you need to finish the job well.
Don’t take “best practice” lists seriously. Come on. In modern software development, aren’t we supposed to think for ourselves? Is that an idea STILL before its time?
Do you have a way of matching your final outcome to your original goal?
Unit testing is one step along this path.
Requirements traceability is another.
Different SDLCs meet this criterion in different ways but all should meet it. (Hands up whose team does NOT have a formal SDLC - defined and agreed to - whether XP, Waterfall or something in between).
Note that this is NOT just quality assurance. QA ensures that you have a well-built product. It does not ensure that the product does what it is supposed to do.
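The traceability idea above can be sketched in code: each test carries the ID of the requirement it verifies, so you can match the final outcome back to the original goal. This is a minimal illustration with hypothetical names (REQ-042 and `apply_discount` are made up for the example):

```python
import unittest

# REQ-042 (hypothetical requirement): orders over 100.00 receive a 10% discount.
def apply_discount(total):
    """Return the payable amount after any volume discount."""
    if total > 100.00:
        return round(total * 0.90, 2)
    return total

class TestReq042Discount(unittest.TestCase):
    """Traces to REQ-042: volume discount on large orders."""

    def test_large_order_gets_discount(self):
        self.assertEqual(apply_discount(200.00), 180.00)

    def test_small_order_is_unchanged(self):
        self.assertEqual(apply_discount(50.00), 50.00)

# Run with: python -m unittest <this file>
```

A test suite organized this way doubles as a traceability matrix: grep for a requirement ID and you find exactly the checks that verify it.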
First of all - great blog, Jeff, big props! I’ve never made a comment before, but I thought I’d make one now :).
I extended Joel’s list a while ago myself, and it caused quite a stir after I posted it to one of the forums. Here are the additions:
Do you write unit tests for most of the code?
Do you use a unit-test coverage analyzer?
Do you use automated code analyzers?
Do you conduct code reviews on a regular basis?
Do you use version control for all documentation?
Does your development lead/architect write code?
Do you refactor existing code?
A lot of this is arguable; the last two points drew the most objections. But in my experience across a variety of projects, they all make sense.
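To make the “automated code analyzers” point concrete, here is a minimal sketch of one, written with only the standard library. It flags functions that lack docstrings; the rule and the sample source are assumptions for illustration, not any particular real tool:

```python
import ast

# Hypothetical sample source to analyze.
SOURCE = '''
def documented():
    "Has a docstring."
    return 1

def undocumented():
    return 2
'''

def find_undocumented_functions(source):
    """Return the names of functions defined without a docstring."""
    tree = ast.parse(source)
    missing = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and ast.get_docstring(node) is None:
            missing.append(node.name)
    return missing

print(find_undocumented_functions(SOURCE))
```

Real analyzers (lint tools, style checkers, static analysis suites) apply hundreds of such rules; the value is that they run the same checks on every commit, which humans won’t.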
My favorite definition of a “legacy system” is “a system without automated unit tests”. After spending my entire working life (nearly 25 years) in software development, and having worked for over three years in the IT division of a huge organization on million-line projects that have virtually no automated unit tests, I can say without a doubt that automated unit tests are an absolute, total, unqualified, 100% must-have for the medium- to long-term viability of a large project.

There are many reasons for this, but the most important one is this: I have seen many developers in this organization almost tremble in fear at the thought of having to make a change to production code, lest it result in an expensive and high-profile production outage. The lack of unit tests to catch functional regression is so fearsome that teams have written totally new systems rather than try to work with existing ones. Of course, those teams write legacy code right from the start because they still don’t create unit tests, so their successors will throw away their code, too.
I cannot even begin to estimate the massive waste of resources, the frustration, the stress, the long hours - all totally unnecessary - as a result of not having automated, properly designed and executed unit tests. I’m not going to go into what constitutes a proper unit test - there’s not enough time for that and there are plenty of resources out there.
What many people don’t realize is that the mere fact of having thorough unit test coverage ensures that you have testable code. Testable code is in general better designed and more flexible than the alternative. It has to be by the very nature of testability, which requires a great degree of decoupling and reusability (namely, the code needs to be “reused” by the test cases).
In addition, unit tests help to answer many questions about what the code is supposed to do. The reality of many large IT divisions is that there is never adequate documentation, and there are literally millions of lines of code that people simply don’t understand and are afraid to touch. A decent unit test suite would go very far towards compensating for the lack of other information about what all that code does.
However, from what I have seen, the likelihood that the corporate IT developer will create anything approaching good (or any!) unit tests approaches zero. Sorry to be negative, but I have just seen too much (or, perhaps I should say, too little). Quite frankly, I have come to believe in Sturgeon’s Law, which I will paraphrase: 90% of software is crap, because 90% of everything is crap. Have a great day.