Are All Programming Languages The Same?

There's a chart in Code Complete that compares the productivity of working in different languages:

This is a companion discussion topic for the original blog entry at:

So if a language had a library that contained a single function such as:

public void ReadFileAndWriteToConsole(string fileName)

Then it would be considered a much better language than any other based on line count? Hmmm… there is still something wrong with your metrics (and obviously your example).

Your point that “the main work loop, if considered alone, is almost identical in every language” is a good one. However, it seems to show that newer languages have more robust libraries than the original C language had, not that the languages themselves are that much better. So where do you draw the line between comparing “languages” and “libraries”?

The first one yields 5 lines, and the second yields 4. If LOC is ever used to measure productivity or efficiency, a programmer shouldn’t be rewarded or penalized for where their coding style breaks to the next line.

I realized after I posted this that the counting methods used in Des’ post ignore “do nothing” lines, so the C# curly positions don’t matter and I shouldn’t have bothered.

This also means the ending clauses in VB aren’t counted. In “If… End If”, only the If line is counted.
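That counting rule is easy to sketch. Below is a hypothetical Python counter (my own illustration, not the actual tool used in Des’ post) that skips blank lines, lone braces, and VB-style closing keywords:

```python
def count_loc(source):
    """Count lines of code, ignoring blanks, lone braces, and
    VB-style closing clauses such as 'End If' -- a rough sketch
    of the "do nothing lines don't count" rule described above."""
    ignorable = {"{", "}", "};", "end if", "end sub", "end function"}
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            continue  # blank line
        if stripped.lower() in ignorable:
            continue  # "do nothing" line
        count += 1
    return count

csharp = """
if (x > 0)
{
    Console.WriteLine(x);
}
"""
print(count_loc(csharp))  # the braces don't count: 2
```

Under this rule the C# snippet and its VB equivalent (`If x > 0 Then` … `End If`) both score 2, so curly placement and closing clauses wash out of the comparison.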

Your point about the fact that “the main work loop, if considered alone, is almost identical in every language” is a good one

In every MODERN language, e.g. not C or C++. I assume Java will soon add, if they haven’t already, a File.ReadAllLines() method.
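For what it’s worth, Python’s standard library already gives you the same one-call convenience; the library, not the language syntax, is doing the heavy lifting. A minimal sketch:

```python
from pathlib import Path

# Write a small file, then read and print it back in one call --
# the whole "read file and write to console" task is one library hit.
Path("demo.txt").write_text("line one\nline two\n")
print(Path("demo.txt").read_text(), end="")
```

Which is exactly the point about languages vs. libraries: the main loop has disappeared into `read_text()`.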

So, like you were saying, we have a lot of libraries. Does the language glue matter that much any more?

I’ve never really understood this whole flap. If Perl or Python does something with fewer lines of code, then it’s because there’s a library somewhere that wraps up a bunch of primitive operations into one function call. You could write this in whatever language you’re using and then you’d have the same productivity. The thing that makes C painful is the memory issues, but you can even make a lot of this go away with good libraries…or smart pointers.

Assuming equivalent libraries, I honestly think the thing that contributes the most to productivity is familiarity with the language and libraries, and things like Intellisense so you can know what everything does even if you’re not that familiar, or you’ve forgotten.

“Priest of BOB” posted this to a different topic but it belongs here:

open FILENAME, $filename = '' or die 'foo';
print (@bob = <FILENAME>);


If you’re worried about an exception while reading, then throw an eval around the print.


So, Josh, if I understand you correctly, you’re saying that the IDE is more important than the language?

I agree…

Hi Jeff,
Thanks for the trackback. I’ll try to answer/defend some of what I said.

To Marty: Yes, I would never count a bracket as a line of code (unless I was being paid per line of code, of course).

To Matt: obviously, as I pointed out, this was not a rigorous scientific analysis of languages. The IEEE paper has that if you want it. I agree that the work loop is very similar in all languages; it’s the amount of object creation/initialisation sit-ups you have to do that weighs down the OO languages.

To Josh: I agree, but I would say that Perl/Python provide libraries written by hackers for hackers, and as a result they tend to handle many things in a far more concise manner.

Just for reference, I am a PhD student researching Computer Science Education. Many questions I regularly deal with are along the lines of “How many languages should we teach?”,
“What is the best language to teach first?”,
“What is the one language all students should know?”, and
“Should we teach the libraries or just the language?”.

Many of these are cut-and-dried issues (for me at least), but I must say I find the notion that the IDE is more important than the language very interesting. Something I will think more about.

I must confess I only started reading this blog today, but I will stick with it. It is very interesting stuff.

In defense of the OO languages, their strength is their ability to handle complexity, not their brevity.

Given - “the abandonment of C/C++ for mainstream programming”.


Name any OS which isn’t coded in C/C++. I mean, a real one.

Name any Office package which isn’t coded in C/C++. I mean, one with measurable market share.

Name any database which isn’t coded in C/C++.

Name any X which isn’t coded in C/C++. Where X = webserver, application server, financial application, image analysis package, etc.

I think your definition of “mainstream” must be different from mine, because from my point of view EVERY mainstream program is written in C/C++, and nothing is even close.

Someday this may change - when pigs fly, maybe, or when hardware is SO FAST that performance doesn’t matter - but that day is not today.


I’ve tried python for hobby code while using C/C++/C# at work. There are pros and cons to using a scripting language such as python.

Pros: It is indeed much faster to code and test most practical apps in python. It’s not just LOC. Perhaps it’s the lack of the compile step, just code, run and test, repeat until done, a very significant productivity gain. Or maybe it’s because the libraries are much more varied, there’s always a python lib out there that supplies functions for what you want to do. It could also be the little stuff. For example, to return a set of x,y coordinates in python, you just do it:

def func1():
  x = 1
  y = 1
  return (x+2, y+3)

(a,b) = func1()

In C++ or C#, I would need to declare a struct or class in order to return the coords. In general, there’s less anxiety while coding due to the loose typing and such.

Cons: There are significant negatives which are enough to make me shift back to C++ or C#. The main problem with python is the lack of documentation and a good GUI lib (wxWidgets doesn’t cut it). Sometimes when developing apps you want to put in that extra 10%, whether it’s extra UI trimmings or a perf improvement, or something. And I found myself stuck, not knowing how to do it in python due to the lack of docs. In C++/Win32 or C#/.NET, there’s always a documented way to get that extra 10%, which makes all the difference.

So I would say, for quick programming jobs, consulting tasks, etc. a scripting language such as python would do the job well. This would probably be the majority of all practical apps. But if you’re developing your own shareware or shrinkwrap products, better do it in a well-established environment such as C++, C#, or VB.

Complexity measures like Function Points are probably more relevant here.


$a = "2";
$b = $a + 3;

is less complex/has fewer FPs than the equivalent C

char* pA = "2";
int b = atoi(pA) + 3;

especially if you count the type declaration as a FP…

Scripting wins by making the environment smarter, at the expense of predictability.

$a = "abc";
$b = $a * 3;
# returns "abcabcabc" 
$x = "69";
$y = $x * 3;
# y = "696969" or 207 ?

To respond to Jeff, the equivalent entire perl “script” would be:

#! /usr/bin/perl -p

…which I’m not sure if you’d count as zero lines of code (perl is infinitely better? ;), but in general I’d agree with Ole: probably 99% of applications I work with are in C or C++[1]. The last 1% are some form of quick automation script, and a very few are desktop apps in C#/Java/Python. With an average developer-class workstation I can often still spot which are in the latter category by their startup time, not to mention that something always goes wrong with them every time their runtime changes (with Java being the worst, and python being the best, but there isn’t much in it).

[1] This is different if you talk about webapps. … but IME these are often written by morons who should be required to write them in ASM as revenge.
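For readers who don’t know the `-p` switch: it wraps the (here empty) script body in an implicit read-and-print loop over the input. A rough Python equivalent, as a sketch rather than a one-liner, might look like:

```python
import fileinput

def cat(files):
    """Echo every line of the given files (or stdin if the list is
    empty) -- roughly what `perl -p` does with an empty script body."""
    for line in fileinput.input(files=files):
        print(line, end="")
```

Whether the Perl shebang counts as zero lines or one, the Python version clearly doesn’t.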

Exactly! If you’re talking about productivity (and not, say, extracting the last microseconds of performance for some space probe), then I think the biggest bang for the buck is in the IDE, not the syntax.

Ironically, I read something by VB god Matt Curland that said that in this respect VB is better than C/C++/C#, because the verbosity of the syntax made it much easier to build Intellisense for. Can’t find the link just now.

Mogensen said
"Scripting wins by making the environment smarter, at the expense of predictability."

That point about lack of predictability doesn’t fly, at least not with your example, which rather displays a lack of Perl knowledge :)

The first example does not result in string repetition (you should have used the “x” operator, not the “*” operator, for that), but a numeric operation (so $b contains 0). That’s basic to how the Perl type system works. So in this case it’s predictable if you know the language :)
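For contrast, Python answers the ‘“696969” or 207?’ question the other way: `*` on a string repeats it, and there is no silent string-to-number coercion, so mixing types is an error rather than a surprise.

```python
# Python's answer to the "696969 or 207?" question above:
# strings repeat, and are never silently coerced to numbers.
print("abc" * 3)   # abcabcabc
print("69" * 3)    # 696969
try:
    "69" + 3       # no implicit coercion in Python
except TypeError as e:
    print("TypeError:", e)
```

Both languages are predictable once you know their rules; they just picked opposite rules for the same expression.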

  • I perceive the examples in this post as too simple to draw any conclusions from
  • Having programmed in C/C++/Java/Python, in my personal experience Java is 2x more productive than C/C++ and Python is 2x more productive than Java
  • A top programmer can be 10x more productive than an average one, although this is off-topic; the discussion is about programming languages
  • Libraries (provided by the language, third-party, or the programmer himself) are an important factor (that’s why I find Java more productive than C/C++)
  • I agree with Brandon that a line with a bracket is a LineOfCode, except for Python ;)
  • The larger the project (also off-topic :-)) the less impact programming language productivity has on total project productivity. Large projects eat a lot of resources on other activities besides programming: rigorous requirements management, functional specs for interface definitions, detailed project plans with critical-path identification, subprojects coordinated by a team of leaders, integration testing… See also “Managing Complexity and Uncertainty” on my Software Quality weblog (click my name below).


By the way, what tools do you use to count LOC for the different languages?

Yes, but… all other things being equal, less LOC is generally better. That’s fewer lines of code to debug, to understand, etcetera.

It’s just a general guideline, of course, not a strict rule.

I can’t help but think that all this LOC stuff is just a load of bollocks. For productivity to be affected by the amount of typing you do, you have to be typing an awful lot. All the studies I’ve ever seen suggest that the average programmer’s number of correct lines of code produced per day is in the order of 20 (lines that actually do something, not counting autogenerated lines, eg by IDE form designers, or extra syntactic fluff).

20 lines. Per day. But you spend all day typing, right? No you don’t. And when you are typing, you aren’t always writing new code, are you? No.

If conciseness was the ultimate measure of productivity we’d all be using Forth or APL.

Scripting languages may be all very well for trivial applications, but most applications are huge. It is far more important that our code be readable (by others as well as ourselves). Code that is easy to understand is much more productive in the end, no matter what the language. OO languages tend to score better when it comes to features supporting programming in the large.

Examples like the one given here lead only to pointless arguments over people’s favourite language/syntax/code layout/naming conventions. It is completely irrelevant to a discussion of productivity.

Name any OS which isn’t coded in C/C++. I mean, a real one.

The web. The web has basic I/O, file storage, APIs, etc. It is an OS; it abstracts complex systems. And it is coded mostly in Perl, Python, PHP, etc.

Name any Office package which isn’t coded in C/C++. I mean, one with measuable market share.

No one cares, it’s not 1992 anymore.

Name any database which isn’t coded in C/C++.


Here’s how I would code the example in Common Lisp:

(with-open-file (s "rl.lisp" :direction :input)
  (do ((line (read-line s nil :eof) (read-line s nil :eof)))
      ((eq line :eof) (values))
    (format t "~A~%" line)))