I'm firmly in the "everybody should learn" camp, but I think the point is being missed by both sides (here at least, and in some of the spill-over discussions I've seen elsewhere).
At the risk of "No True Scotsman", the universalist argument has not been that people can't truly be users (as opposed to "the used", I suppose) unless they know how to program their own computers, or at least write effective macros and utilities, the Stallmanites (both orthodox and conservative) notwithstanding. As Jeff and others have eloquently pointed out, for most people a computer is nothing more than a versatile sort of appliance. And if we in the development community are doing our jobs properly, we are making $person awesome at $task. (I really miss Kathy Sierra.)
Programming will probably always be a specialist venture. We may be able to modularize the heck out of things, but it's unlikely that we'll ever get much beyond the equivalent of a Moog synthesizer. Remember them? You could spend hours playing with patch cords without ever finding a way to change the pitch with the keyboard, and getting anything other than a rudely biological sound out of one required either special knowledge (whether that be book learnin' of the waveform interaction or experience with the synth) or luck. People who are trying to accomplish $task will learn what they need to use to do $task; their productive value lies somewhere other than in building TaskDoer 2100 (which is where our value lies).
There has also been an intimation that people will gain a clearer understanding of the development process, including things like project estimation and programmer husbandry, by learning to sling a bit of code themselves. That is both a very long way from the truth and a long way from the universalists' point. In terms of the truth: do you really want somebody basing their understanding of highrise engineering difficulties and construction schedules on knowledge gleaned from building a gazebo during a carpentry night course? Because that's the sort of thing we're talking about here.
No, the true utility of having everybody learn to program (not necessarily code, but program) is to be found in the education of our children. As a side effect, we may find that we increase the pool of potential professional developers that we can draw from, but that, too, is missing the point somewhat. We have a tremendously narrow view of what it is that we're doing simply because the main application of it is accidentally tied up with computers at the moment, and because the bloody-minded literalness of digital computers (or, as Hofstadter's Crab so wonderfully named them, "smart-stupids") is an excellent milieu in which to test what we are really doing. What we are really doing is formalizing process, and that has application just about anywhere you care to look.
There are pedagogical aspects that need careful attention; what is taught and explored needs to be generalized to the wider world. There are just too many things that kids are exposed to in school without ever really learning their applicability outside of the classroom. We laugh at Schlemiel the Painter, but there are people out there in the real world performing strenuous physical tasks with a big-O of n^2 or n! without having any intuitive understanding of the complexity or how to determine whether there might be a better way. And what of those whose job it is to write instructions for other people? That's not the exclusive province of documentation departments. Having a sense of how things may be interpreted, how things can go wrong, debugging, and so forth certainly wouldn't hurt. You may be able to do a top-down process improvement by bringing in time-and-motion analysts, but you'll get a lot more bang for the buck (and a lot more bucks from the bang) if people working at the pointy end can optimize their own work.
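Schlemiel's walk back to the paint can has an exact counterpart in code: building a string by repeated concatenation re-copies everything done so far on every pass. A minimal Python sketch (the function names are mine, purely for illustration):

```python
# Schlemiel the Painter, in code: each append re-walks the whole prefix.
# n appends cost roughly 1 + 2 + ... + n = n*(n+1)/2 character copies --
# quadratic, even though each individual step looks innocently constant.

def schlemiel_join(words):
    result = ""
    for w in words:
        result = result + w  # copies len(result) characters every time
    return result

def sensible_join(words):
    # One pass, one allocation: the "keep the paint can with you" fix.
    return "".join(words)

def copy_cost(n):
    # Characters copied by the naive loop for n one-character words.
    return sum(range(1, n + 1))  # n*(n+1)/2, i.e. ~n^2/2

words = ["paint"] * 4
assert schlemiel_join(words) == sensible_join(words) == "paint" * 4
assert copy_cost(10) == 55
assert copy_cost(100) == 5050  # 10x the input, ~100x the walking
```

Both functions produce the same wall of paint; the point of the exercise is that only the cost differs, and that the difference is invisible until n gets large -- exactly the intuition the passage above says people never get to build.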
We who are in the biz are going to have to know (ideally) the "multivariate calculus" level of the formalism, as well as the accidental details of computing languages and environments, in much the same way as any specialist needs to have an in-depth technical knowledge of their area of specialty. That's not what needs to be universalized. It's the "arithmetic" level, maybe with a bit of "basic algebra". And while programming a digital computer might be a good vehicle for developing some of that understanding, it's not about getting everybody up to speed in JS/Java/C#. Raspberry Pi will include Scratch; Alice/Looking Glass serves a similar purpose for slightly older kids. I'm still a fan of Logo and chasing turtles around -- I never actually used a Logo machine, but I grokked recursion after reading about it (and before practical home computers).
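What makes the turtle such a good teacher is that a recursive procedure is the whole program: a step, a turn, and a smaller copy of itself. A toy sketch of that idea in Python (this is my own stand-in turtle, not Python's turtle module or actual Logo):

```python
import math

# A toy turtle: just an (x, y) position and a heading in degrees.
class Turtle:
    def __init__(self):
        self.x = self.y = 0.0
        self.heading = 0.0  # degrees; 0 = east, counterclockwise positive

    def forward(self, dist):
        rad = math.radians(self.heading)
        self.x += dist * math.cos(rad)
        self.y += dist * math.sin(rad)

    def left(self, angle):
        self.heading = (self.heading + angle) % 360

def spiral(t, steps):
    # The entire spiral is "one step, then a smaller spiral":
    # the base case and the self-call are the whole definition.
    if steps == 0:
        return
    t.forward(steps)
    t.left(90)
    spiral(t, steps - 1)

t = Turtle()
spiral(t, 4)
# Moves: east 4, north 3, west 2, south 1 -> ends at (2, 2).
assert round(t.x) == 2 and round(t.y) == 2
```

A kid watching the square spiral tighten can see the recursion unwind, which is rather more visceral than being told about stack frames.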
Kids who want to go on to "real" programming can, using real-world programming tools, but if everybody had a basic understanding of how to decompose tasks and a feel for patterns of performance (learning big-O without ever thinking of saying the word asymptote), they'd all benefit. Well, unless they go on to study and then teach English Lit or something silly like that.