What's the difference between a programming language and a scripting language? Is there even a difference at all? Larry Wall's epic Programming is Hard, Let's Go Scripting attempts to survey the scripting landscape and identify commonalities.
This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2009/01/a-scripter-at-heart.html
@chromatic if it’s not officially out, it’s not out.
IMHO, Scripting languages are a subset of programming languages so the debate is moot.
It used to be that scripting languages drove other executable programs to get work done rather than doing it themselves. Think of MS-DOS batch files, AmigaDOS script files, and Unix shell scripts: they all call external programs to do the real work (though, thanks to I/O redirection, that interfacing feels natural and almost part of the language). Of course, this definition isn’t so useful or clear these days, but it’s a starting point.
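That classic sense of "scripting" (glue code that drives other programs) can be sketched in a few lines of Python. This is a minimal, hypothetical example; to keep it portable it uses the Python interpreter itself as the "external program" being driven via `subprocess`:

```python
import subprocess
import sys

# A classic "script" does little work itself; it chains external
# programs together and wires up their I/O. Here the external
# program is simply another Python process, so this runs anywhere.
proc = subprocess.run(
    [sys.executable, "-c", "print('hello from an external process')"],
    capture_output=True,
    text=True,
    check=True,  # fail loudly if the external program fails
)
print(proc.stdout.strip())
```

In a real shell script the same idea is just `grep foo log.txt | wc -l`: the script contributes almost nothing but the plumbing.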
(I do, however, require Turing completeness, so I don’t count HTML!)
I would argue that the difference between scripts and programs is one of scope. A script is any program, in any language, that does a single simple task; it may also be any code that starts and terminates each time it is used.
An application might do more than one thing, stay in memory across many uses, handle many kinds of user input, and need to be more stable. A script, by contrast, typically takes no input beyond the command line.
A script can be written in any language, even in C/C++. The choice of language should not hinge on whether it compiles or not; it should be based on the merits and abilities of the language.
Here is just a tiny fraction of the applications on C/C++'s resume:
- All major Internet search engines (Google, Yahoo)
- All major Microsoft/IBM/etc. applications.
- NASA’s Jet Propulsion Laboratory
- All major operating systems (Windows, Unix, Linux, Mac)
- All major web browsers (IE, Firefox, Safari)
- All major web servers (IIS, Apache)
- All major databases (Oracle, SQL Server, MySQL, etc)
- All major word processors (MS Office, Open Office, WordPerfect)
- Most major development tools (Visual Studio, Rational Rose)
- Most commercial shrinkwrap games (Grand Theft Auto, World of Warcraft)
- All major 3D software (Maya)
- All major image editing software (Adobe Photoshop, Gimp)
- All major sound production software (SONAR, Pro Tools)
- Most voice/image-recognition software (Dragon)
If these types of applications aren’t your thing, that’s fine, but let’s give credit where credit’s due. C++ has succeeded for a reason, and that reason is not that programmers are masochistic elitist assholes.
C programming is neither harder nor better than (for example) .NET programming. If anything it’s easier, because it has fewer parts.
If you can deal with all the complexities of a feature-rich language like C#, you can learn C in your sleep.
And that, Jeff, is why I’ve never quite understood your slightly-but-not-quite hostile stance towards C/C++. This is not an argument you need to make, it’s not an argument you can win, and it’s not an argument that would benefit you if you did win it.
So what gives?
These kinds of arguments about irrelevant distinctions just invite holy war. Why don’t we argue about which programming language is superior?
Here’s Stroustrup’s list of major C++ applications, interesting reading.
Forth is the one true language that all others aspire to, because it encompasses everything from assembly code up through compiled and tokenized representations, to high-level interpreter.
If you really want to be a programmer, write a Forth system. In Forth. Then port it.
I’m a programmer at heart. While I have some ability to work with scripting languages, I’m used to the compiler picking out my errors, since I write everything in one go.
You’re a real programmer if your comments say why something is done (i.e. you can explain why that piece of code is there). It doesn’t matter which language you use, or whether it’s interpreted or compiled.
How many users actually use your application? Now that’s the ultimate metric of success.
So this is how it went:
1. Add a few statements to a script file in an attempt to fix a report bug.
2. Launch the app in debug mode using a custom-made debugger.
3. Wait 10 minutes to reach your first new statement.
4. Oops! THERE’S A TYPO IN YOUR CODE!
5. Fix the typo.
6. Repeat steps 1 to 5 three more times.
7. Start actually running your code now that it’s typo-free.
8. See that your first naive attempt didn’t quite do it.
9. Rework the new code.
10. Repeat steps 1 to 9 three times.
About 10 to 20% of the development time was wasted fixing stupid shit a compiler would’ve caught. Yes, that’s at least one entire afternoon per week of pure time wastage.
I realized then that only a masochist would use JS to write a PC app!
(Yes, I’m coming off harsh. But so would you if you had been there.)
I’m glad you wrote this piece. Because I’m a scripter too!
I came a long way from Fortran, Pascal, Delphi, and Java to finally discover Python. You know that XKCD strip with import antigravity? That was my moment of relief three years ago.
Quote: “What happened next was the eight unhappiest hours of my computing life.”
Hahahah, eight hours? Tell me about it. I have a similar story, although I learned C later, on a 486. I learned assembler on the Amiga when I figured out I wouldn’t be able to make games and demos in Amiga BASIC.
So I spent a WEEK on a tiny program (10 lines or so!) whose sole purpose was to access one standard library and output a simple hello message in the CLI.
For one week I kept looking at those 10 lines trying to figure out what wasn’t working. Eventually, I realized the name of the library in the string didn’t use the same casing as in the book. Then the program ran… and I cried! I’ve never cried over programming since (just pulled out some hair), but hey, I was 17 or so.
I must admit that I’m the complete opposite: I’ve never ‘scripted’ before (with the exception of 30 minutes of frustrated Python, screaming “why can’t I overload that function?”!).
If asked to write a short program I’d always reach for C#/Java, not Python etc. I suppose it’s what you’re used to and what you feel comfortable with!
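For what it’s worth, the overloading frustration above has an idiomatic answer: Python has no C#/Java-style overloading by signature (a later definition of a name simply replaces the earlier one), but the standard library’s `functools.singledispatch` dispatches on the type of the first argument. A minimal sketch, with a hypothetical `describe` function:

```python
from functools import singledispatch

# Python can't overload by full signature, but singledispatch
# picks an implementation based on the first argument's type.

@singledispatch
def describe(value):
    # Generic fallback for any type without a registered overload.
    return f"something: {value!r}"

@describe.register
def _(value: int):
    return f"an integer: {value}"

@describe.register
def _(value: str):
    return f"a string: {value!r}"

print(describe(42))    # dispatches to the int implementation
print(describe("hi"))  # dispatches to the str implementation
print(describe(3.5))   # no float overload, so the generic fallback runs
```

It isn’t full overloading, but it covers the common “different behavior per argument type” case that C#/Java programmers usually miss.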
I don’t think the user really cares if it’s scripted, they just want it to work. I think an important skill of a developer is knowing when to stop making things more complicated for yourself and just get the job done.
I’ve found that 90% of the time, the differences between the technologies and approaches you use don’t make a huge difference to the user. So if it doesn’t matter that much, pick whatever you’re comfortable with; being more at home in a technology may produce a better solution than forcing yourself to use the ‘correct’ technology (within reason, obviously).
I’d argue that drawing a line in the sand and saying:
on this side is scripting and on that side are the real programmers
is neither useful nor truthful. Like pretty much everything, it’s a continuum.
It’s not what you use, but how you use it.
Hi. I like your blog. I’m from Chile so I’m not very good at English.
Your article reminded me of the days when I had my Atari 800XL. I tried to write some BASIC programs and it was very pleasing. Later, when I tried to write C programs on my first PC, I didn’t understand anything. Only at university did I learn how to program in compiled languages. I was impressed by how much time you have to throw away to write a program that took me only five minutes in BASIC.
I like to say that it doesn’t matter what type of language you use. It matters more that the program solves the problem than that you use mega-ultra design patterns.
I have to go. Bye. Keep writing!!
I agree with the majority of posts so far. If you are creating a system for others’ use, then use whatever makes you the most productive, and to hell with all the zealous “real men program in brand X” crap.
On the other hand, if you are just goofing off, then pick your own poison. What doesn’t kill you, makes you stronger.