I traversed from BASIC to C in much the same way as Jeff, but had a very different experience. The reason is that I had learned BBC Basic, which had a built-in assembler and slightly more powerful indirection operators than PEEK and POKE. In short, by the time I approached C, I already had an intuitive understanding of what pointers were… and was frustrated by the pain of managing memory for myself (malloc was lovely by comparison).
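For readers who never met BBC BASIC's indirection operators, here is a minimal C sketch of the same idea the comment alludes to. The `roundtrip` helper is hypothetical, purely illustrative: allocate manually, write through the pointer, read back through it, and release manually.

```c
#include <stdlib.h>

/* Hypothetical helper: heap-allocate an int, write through the pointer,
   read it back, and free it -- the C spelling of the kind of byte/word
   indirection BBC BASIC exposed directly. */
int roundtrip(int value) {
    int *p = malloc(sizeof *p);   /* manual allocation */
    if (p == NULL)
        return -1;                /* allocation can fail; caller must check */
    *p = value;                   /* indirection: write through the pointer */
    int result = *p;              /* ...and read back through it */
    free(p);                      /* manual release: forget this and you leak */
    return result;
}
```

The pain the comment mentions is exactly that last step: in C, every `malloc` needs a matching `free` that the programmer tracks by hand.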
Nevertheless, I also have to give a lot of credit to the book Illustrating C by Donald Alcock, which is the best introduction to pointers (and to using pointers and pointees in algorithms) I have come across, and which gave me far more confidence in the C landscape.
While Jeff went on to be a scripter, I spend my days writing device drivers and network stacks… in C. But if I were asked to write something higher level - well, there the more controlled, managed languages (be they scripting languages like Python, or compile/run-loop languages like Java) win every time.
I’m noticing here a lot of people complaining that some things don’t work like their favourite tool. Which is a fine thing to do, if you are happy to use the hammer you know and love to remove the couplings from pipes because you don’t get on with wrenches. Or if you’re happy to limit yourself to hitting nails into things. Jeff doesn’t use C, because Jeff doesn’t write low level code (and it seems to me, he is perfectly happy about that).
There may come a day when Jeff isn’t able to solve some low-level problem because he doesn’t understand everything that is going on in the system below his code - but I get the same thing when bits of hardware don’t work as expected… my Verilog is rusty and, for the most part, I’m happy about that.
You’re pretty down on C all the way till the end, where you give it the worn ‘use the best tool’ pat on the back. It should be noted that in order to do anything with your scripting languages, you need some libraries that are written in something lower level. You are a user-facing programmer, and you program stuff to look pretty and be nice to the user, but don’t forget that your perspective on the grand world of programming is pretty darn well limited.
WebKit, Trident, Gecko: the things that provide everything you need to make things of beauty.
What about .NET or Java? Your operating system?
What you’ve found is that it’s pretty stupid to program user-facing programs in low-level languages. That’s a Daily WTF-grade ‘no duh’, except in some cases where portability is a big deal. Enjoy your scripting, and don’t forget that for every hundred of you, there’s a real programmer on whom your precious script depends.
The comments on this thread remind me what I hate most about working with computers: all the macho dimwits who think getting a computer to do something is hard. Wake up! Smell the coffee!
Try getting people to do things, try getting your girlfriend to make you happy ;0)
Most of you will spend the rest of your life in misery bashing out code or, even worse, managing other coders, because you don’t realise how important it is to be able to communicate with people.
Please excuse me for going off-topic in order to shed some light on a few inaccuracies:
Perl 6 has been released for quite a while now. Still incomplete, but usable and true to the release-early, release-often paradigm. Also, as Larry Wall points out, Perl 6 is not his piece of work: ‘Perl 5 was my rewrite of Perl. I want Perl 6 to be the community’s rewrite of Perl and of the community.’
It has been in development for quite a while now, true, but most people really involved with it expected a big timeframe like this (i.e., ~10 years). After all, their goal is big: Perl 6 includes operators and language constructs not found in any other imperative/object-oriented language (except Python, perhaps), some of them comparatively new in language theory.
Moreover, the most convenient Perl 6 constructs leak into Perl 5, so if you’re using 5.10, you’re already partly in Perl 6 land.
It’s always hard to draw the line between scripting and programming. Personally, I think the very nature of the two concepts makes the line impossible to draw. If we assume that to do programming you must do memory allocation, then almost nothing is programming anymore and almost everything is scripting.
For me, scripting has been more about telling programs what to do at a high level, whereas programming is about creating a set of instructions to solve a problem yourself. Scripts are data files loaded by some executable, whereas programs are the executable itself, which means that stuff like VB6 is especially blurry by my measures, as it can be interpreted or compiled…
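That ‘script as data file’ distinction can be sketched with a toy interpreter in C. The `run_script` helper below is hypothetical and purely illustrative: the compiled program is the executable, and the string of one-letter commands it consumes is the script.

```c
/* Toy interpreter: the C program is the executable, the string of
   commands is the "script" it loads as data. Commands operate on a
   counter: '+' increments, '-' decrements, 'z' resets to zero. */
int run_script(const char *script) {
    int counter = 0;
    for (const char *c = script; *c != '\0'; c++) {
        switch (*c) {
        case '+': counter++; break;   /* increment */
        case '-': counter--; break;   /* decrement */
        case 'z': counter = 0; break; /* reset */
        default:  break;              /* ignore anything unrecognised */
        }
    }
    return counter;
}
```

By this comment’s measure, writing `run_script` is programming, while feeding it `"++z+"` is scripting - which is exactly why the line is so blurry.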
IMO the true realisation is that all languages are programming languages, but you can use programming to produce powerful scripts. Hence scripting languages, in order to be powerful, should provide programming constructs and tools…
I think of HTML as a scripting language, for example: it provides a set of instructions on how a browser should display a page. However, it does not provide programming functionality out of the box; for this we introduce something like JavaScript, which provides more power by allowing the use of programming constructs within an HTML document. Ultimately, though, the context is still providing instructions (i.e. a script) to the browser, so HTML with JavaScript is a script, but it involves programming nonetheless.
From my perspective you’ve confused high-level with scripting and low-level with real programming… it’s all programming, just with different levels of abstraction.
‘Unlike applications from the previous paradigm, web applications are not released in one to three year cycles. They are updated every day, sometimes every hour.’
Which is why scripting languages are simply too slow and unproductive for use in the modern world.
Sure, back in the 1980s, it might have taken 4 hours for some primitive 8 bit computer to check your program for correctness. But that was then - get over it.
Outside games and systems programming, you don’t need to manually track memory allocation because you don’t trust the computer to do it fast enough. So languages without some form of automatic memory management are a niche market.
Similarly, scripting languages that report errors at run-time, instead of at typing-time, are pretty much a niche technology. The last edge-cases where you had to manually check your code because you didn’t trust the computer to cross-check it fast enough went away a few years ago.
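For contrast, here is the kind of manual bookkeeping this comment calls a niche. The `duplicate` helper is a hypothetical sketch (essentially a hand-rolled string copy): in C the caller, not a garbage collector, owns the allocation and must remember to release it.

```c
#include <stdlib.h>
#include <string.h>

/* Sketch of manual memory tracking: copy a string onto the heap.
   The returned block is owned by the caller, who must free() it --
   the runtime will never do it automatically. */
char *duplicate(const char *s) {
    size_t n = strlen(s) + 1;     /* include the terminating NUL */
    char *copy = malloc(n);       /* caller now owns this block... */
    if (copy != NULL)
        memcpy(copy, s, n);
    return copy;                  /* ...and must remember to free() it */
}
```

In a garbage-collected language the whole ownership question simply disappears, which is the comment’s point about where the niche ends.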
No seriously, implementation work on Perl 6 did not begin until about 2005. There is now an implementation of Perl 6 on the Parrot VM, called Rakudo, which is unfinished but complete enough that it actually makes sense to start writing simple scripts in it.
Interestingly, these days, using Python feels a lot more like programming than Java… In Java all I seem to do is set up a bunch of config stuff for Spring, OSWorkflow, Hibernate, Kodo, etc…
This blog, to me, sounds like someone who has forgotten the ends for which he or she started to learn and master the means. Why do computers exist? To solve problems. Why do programming languages exist? To give us a way to tell the machine what to do. There are some problems for which a compiled language, like C/C++, is better. These would be problems where size and speed matter. There are problems where scripting languages are appropriate. These would be small problems for which building a C application is overkill, or problems for which the solution is web based, in which case portability is required.
Stop focusing on whether a language is compiled or interpreted. Start focusing on categorizing problems and determining which language is appropriate.
You hated C? How odd… I would have pegged you for a compiled-language kind of guy.
Me, I loved K&R C (even pre-ANSI).
Wrote all sorts of stuff with it.
Animation tools, ray tracers, and even school work.
But I still spend MOST of my time doing Perl5.
Forced to put my finger on it, I’d say it’s due to a mix of not having to compile and having all the OS commands easily available.
Meanwhile, Perl 6 sounds a bit non-backwards-compatible from the small snippets I’ve read. That will greatly inhibit its adoption, though the long delay and the odd perception that it’s a ‘mature’ language at this point don’t help either. The death of Perl is overstated.