Die, You Gravy Sucking Pig Dog!

When I first read about GC I had somewhat mixed feelings. I tried to like it, as I tried to get accustomed to more modern languages. Nowadays, I feel basically the same as when I started.

If I lived in a more high-level context then I suppose I would be using GC by now, but I often deal with HW resources whose lifetime I simply cannot afford to stop thinking about until their certain death.

And I’m still scared as hell by destruction errors (which admittedly usually happen because somebody didn’t do their work right) which suddenly start to happen at random.

And I still hear some horror stories.

And, it sucks to say, but I have gone back to the ‘binary blob’ approach. Sometimes, OOP and type safety don’t fit in the first place, let alone GC.

I’d love to have less work to do, really. And since even the cheapest PC right now (Intel graphics excluded) can run everything I do super-fast, I could even consider paying some performance cost.

But I think I will still wait a little before switching.

I also argue that the runtime should be smart enough to dispose resources that are IDisposable immediately once they fall out of scope. On some level, the runtime is making me do work it should be able to do. If it isn’t in scope, and there are no references to it, and it’s ultra-mega-system-critical, hey guess what? Go ahead and dispose of it for me.

That’s called finalization. But it’s expensive.
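For anyone following along, the “finalization” being referred to is the `~Finalizer` safety net in the standard .NET dispose pattern. A minimal sketch (the `Resource` class name is made up for illustration) of why it’s expensive: finalizable objects survive at least one extra collection and get processed on a dedicated finalizer thread, unless `Dispose` suppresses that:

```csharp
using System;

// Minimal sketch of the standard dispose pattern.
// The finalizer is the GC-driven fallback: late and nondeterministic.
class Resource : IDisposable
{
    private bool _disposed;

    public bool IsDisposed => _disposed;

    public void Dispose()
    {
        Dispose(true);
        // Cleanup already happened, so tell the GC not to queue this
        // object for finalization - it can be reclaimed normally.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;
        if (disposing)
        {
            // release managed resources here
        }
        // release unmanaged resources here
        _disposed = true;
    }

    ~Resource()
    {
        Dispose(false); // runs only if nobody called Dispose()
    }
}
```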

I’d wager the majority of programmers alive today have never once worried about malloc().

From Jeff’s about me page it doesn’t sound like he has either: http://www.codinghorror.com/blog/archives/000021.html.

Explicitly calling dispose on a database connection in .Net causes some VERY nasty things to happen - specifically the connection pool manager gets its knickers in a knot, it never returns connections to the pool, resulting in the pool eventually becoming full and no more connections being possible.

This is complete and utter nonsense. (Honestly, junior programmers out there should be extremely wary of taking anything they read online at face value. Not only was Jeff’s entry completely wrong on virtually every level, he has a squadron of followers who are just as ignorant, yet they desperately hold onto their hilarious nonsense.)

It has been a best practice to always dispose connections since day 1, preferably using the using keyword. Millions of developers are doing it, strangely without error, and have been for years. Those morons who aren’t are enjoying constant ‘connection pool is exhausted’ errors.

Please forgive the poetic license. I am, truly, the world’s worst C programmer.

I would like to know if Jeff has ever written a C app that ran in production somewhere. For some reason, I doubt it.

Agree with Andr on the worse is better link falsification - that’s a dirty trick that will only damage your reputation.

Am I the only one here who suspects Jeff doesn’t even have a solid grasp on the differences between the heap and the stack? This is a good example of why programmers should be forced to learn C!

This article is an example of learning to know when NOT to write about something. Not understanding the subject matter well causes too much confusion.

I’ve had a question about this on Stack Overflow, and the answer makes me think Connection.Close is all you can do to get rid of the resource. The GC makes up its own mind about when to release the resources used.

http://stackoverflow.com/questions/12368/how-to-dispose-a-class-in-net

Another vote of agreement with Arienne and Nicolas: good C++ destructors over garbage collection any day for me. When it comes to memory/resource allocation/de-allocation, I’m a control freak. But good class libraries allow me to be a control freak without being a micro-manager as well. We all win.

Garbage Collection still feels like an abdication of basic responsibility on the part of the programmer to me. In my opinion, it is best used in the forked sections of fork-and-die-model services so that whatever unpredictability it introduces (which admittedly is much less than it was in the days of Smalltalk) will only affect one session of the system and not the system as a whole.

As for writing a SQL connection object in any object oriented language whose destruction semantics don’t include a Close-if-open, Dispose-if-necessary semantics – that reeks of kludginess and would make me curious about what other bad design choices are being made in the overall system.

^ Don’t argue with Jon Skeet
|

In my experience, calling Dispose(), at least on a database connection object, is absolutely necessary.

When we first transitioned to .NET, we had a number of apps which would have connection problems after a short time of use. We traced the problem to the database connections merely being Close()ed and nulled. Apparently, that wasn’t enough. After that, as a general rule, if something is IDisposable it goes in a using statement. No explicit Close, Dispose, or null… Just wave bye-bye as you leave the block.
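The “just wave bye-bye” behavior comes from the fact that the compiler expands `using` into a try/finally, so `Dispose()` runs on block exit even when an exception escapes. A minimal sketch with a made-up `TrackedResource` class standing in for something like a `SqlConnection`:

```csharp
using System;

// `using` compiles to try/finally, so Dispose() runs on every exit path.
class TrackedResource : IDisposable
{
    public bool Disposed { get; private set; }
    public void Dispose() => Disposed = true;
}

class Demo
{
    static void Main()
    {
        var r = new TrackedResource();
        using (r)
        {
            // work with r; with a real SqlConnection this is where
            // you would Open() and run your commands
        }
        Console.WriteLine(r.Disposed); // prints "True" - disposed on block exit
    }
}
```

With pooled database connections, that automatic `Dispose` is also what returns the underlying connection to the pool rather than destroying it.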

I’m a C/C++ guy so I don’t know much about garbage collection but what I hear my colleagues saying is that it always kicks in at exactly the wrong time.

Two of my favorite (though perhaps 15 year old) method names:

ADOBitchSlap()
DealWithThisStinkingGodRottingErrorAndTellMeIfIShouldResume()

And for the love of all things holy in the world would the people who are flat out wrong about the CLR’s GC behavior (you know who you are–or at least I do) spend a little snuggle time with Maoni’s blog? An object is not eligible to be collected until it’s no longer rooted, additional memory is being allocated, and the size of the requested allocation exceeds the available capacity of the ephemeral allocation segment (I apologize for just now sounding a bit like a certain earlier polysyllabic-vocabulary-word-spewing ‘Mr. Fancypants’ poster). +1 to everyone who mentioned the difference between resource and memory pressure. -1 to everyone who posted a religious rant or got all up in Jeff’s grill (you know who you are Mr. Poopy Pants!). -NaN to everyone who responds negatively to me (and cons yourself up a big fat that’s what your mom said to go along with it).

Jeff, you don’t have to call Dispose() on every single object. In fact, you can’t because it’s not part of the object class. You can only dispose of objects that implement IDisposable, which is the implementer’s way of saying hey, this thing uses precious resources, you probably want to dispose of it when you’re done instead of waiting for the GC to come along.

So take their advice. Call Dispose() on every IDisposable - yes - even DataSets in every single function.

Programming’s hard, isn’t it?

It’s a shame to think that the majority of programmers never had to deal with memory management by hand. I love C# and the garbage collector, but the pain you feel in debugging memory leaks and memory management problems really teaches you to code carefully and, well, to give a shit. There’s a downside to making it easy: allowing us to become too lazy is a bad thing.

Hey, I’m not ready to give up C# and the garbage collector, but I sure am glad I learned how to write code in C and C++.

–Matt

A lot of the people pontificating about .Net GC and garbage collection don’t know as much about it as they think they do, and that recursively includes a lot of the people pontificating about how the others don’t understand .Net GC. I’ll exempt Brian, Jon Skeet, and chris by the sound of it.

A few random points: 1) Arrays aren’t value types, so where do you think all C# arrays are allocated, including big arrays? 2) If you haven’t read Maoni, you probably don’t understand what goes on the large object heap or how the large object heap behaves. It’s not automagically reclaimed as you probably think it is. 3) Setting a local reference variable to null is useless. Failing to set a reference variable in some longer-lived object to null can be deadly. 4) Someone already mentioned this, but setting an event handler for an event creates a reference from the publisher to the class containing that handler. If you don’t remove event handlers for an object, it won’t become eligible for collection.
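Point 4 above, sketched with hypothetical `Publisher`/`Subscriber` names: subscribing stores a delegate in the publisher, and that delegate holds a reference to the subscriber, so a long-lived publisher keeps every never-unsubscribed subscriber alive.

```csharp
using System;

class Publisher
{
    public event EventHandler Tick;
    // Inside the declaring class, Tick refers to the backing delegate field.
    public int HandlerCount => Tick?.GetInvocationList().Length ?? 0;
    public void Raise() => Tick?.Invoke(this, EventArgs.Empty);
}

class Subscriber
{
    public void OnTick(object sender, EventArgs e) { /* react to the event */ }
}

class Demo
{
    static void Main()
    {
        var pub = new Publisher();   // long-lived
        var sub = new Subscriber();  // meant to be short-lived

        pub.Tick += sub.OnTick;      // pub now holds a reference to sub;
                                     // sub cannot be collected while pub lives

        pub.Tick -= sub.OnTick;      // unhook; now sub is collectible again
        Console.WriteLine(pub.HandlerCount); // prints "0"
    }
}
```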

as if JZ linked to your blog post of January 30, 2008 when he wrote the article on February 1, 1998

JZ is pretty awesome, but I don’t think he’s invented time travel.

Yet.

Calling Dispose is rarely about optimisation, actually - it’s about correctness.

My concern is that it’s a short mental trip for a lot of developers from “there is this VERY NARROW set of items that I need to explicitly dispose” to “I want to manually dispose of everything!”.

I worked with developers who insisted that every DataSet be explicitly disposed in Every. Single. Function.

http://www.codinghorror.com/blog/archives/000031.html


Jeff, JZ did not link to your blog posts - on Worse is Better in particular, he explicitly linked to his own blog post about it, not yours!

The way you put it seems to me like you’re wanting to make it look like JWZ is linking to your blog posts, which would be remarkable - but isn’t true. And it scratches your reputation a little bit (at least for me).

Explicitly calling dispose on a database connection in .Net causes some VERY nasty things to happen - specifically the connection pool manager gets its knickers in a knot, it never returns connections to the pool, resulting in the pool eventually becoming full and no more connections being possible.
This problem exists in .Net 1.1 and .Net 2.0; not sure about .Net 3.0.