All Abstractions Are Failed Abstractions

Why aren’t you using a stored procedure?

I tried a little test of my own:

select top 1000000 * from table

select top 1000000 ID from table

select * from table where ID in (select top 1000000 ID from table)

These took, in order:

118 seconds

8 seconds

175 seconds

Rather the opposite of the presumption that, since getting 1 row takes 1 second, getting 48 rows with an IN clause takes 48 seconds.

Where is the NOLOCK hint in the SQL, if we are talking about performance?

If all abstractions are failed abstractions, then we shouldn’t be using HLLs at all! We should all be programming in C! Or, better yet, Assembler! Or machine code!

All I’m trying to say is that abstraction is good to a degree. Too much is bad, but languages that abstract the creation of assembler instructions are Good Things.

Hey Jeff…loved the iPhone post, didn’t love this post.

LINQ to SQL (or, better yet, call it “LINQ to SQL Server”, as that’s all it is) kicks ass. It’s just a nice fluent DSL over SQL. That’s it. It’s the thinnest of abstractions.

You really need to correct your code. I’d be happy to do a code review of StackOverflow with you and your guys.

  • You don’t need to call ToList(). Stop doing that and let IQueryable do its job.
  • Use more projections: select new {} anonymous types or select p.ID, rather than select p.

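For instance, a minimal sketch of both points, assuming an illustrative DataContext named db with a Posts table (the names are mine, not from Stack Overflow’s actual code):

```csharp
// Calling ToList() here would materialize every column of every row
// immediately. Leaving the query as IQueryable keeps it deferred and
// composable, so later operators still shape the generated SQL:
var query = from p in db.Posts
            select new { p.Id, p.Title };  // projection: only the columns you need

// LINQ to SQL translates the projection to roughly:
//   SELECT [t0].[Id], [t0].[Title] FROM [Posts] AS [t0]
```
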
Thanks!

Jeff, frankly speaking for once, I think you are misleading your readers. You write select p and complain that it gets converted to SELECT *. If you really wanted only one column, or a few columns, you would certainly use select p.Id or select new {p.Id, p.Name}, etc.

Ferdy wrote:

  However, I would expect that people put effort and competence into tuning, and that buying a server is not the default solution.

And you think there is no cost to the environment, energy and the human soul from doing that? Perhaps, in the time saved debugging and handling user complaints, the programmer could be out riding a boat to save the whales.

@JessicaBoxer: Perhaps the programmer could spend time becoming a better programmer!

Given the fluid syntax, yet indirect control we have over the actual statements sent to the database, I approach LINQ to SQL with cautious optimism.

I see the relationship of LINQ to SQL vs. direct database programming as similar to C# vs. machine code. I trust that the C# compiler and the CLR JIT compiler will produce machine code from my C# code that is sufficiently performant such that the benefit of writing in C# far exceeds the cost of writing machine code directly.

And given that SQL statements are being compiled from my LINQ statements, the “select *” issue may very well be optimized out in the future, and likely with no change to the code I’ve written.

@JessicaBoxer

Your point is funny but from personal experience there is a quite small limit to how often you can just “add a server” and see a real increase.

I recently upgraded our ERP server from a quad-core with 3 gigs of RAM and 3 disks to a dual quad-core with 32 gigs of RAM, 17 hard drives (all 50% faster), and separate physical RAID 10 arrays for the OS, applications, database, log files, and temporary database files.

The net result? A 5% increase in performance. Now I know this ERP system like the back of my hand (but didn’t write it) and I spent a long time making sure that my hardware configuration exactly follows the vendor’s so-called “best practices”.

It’s simply that the system runs on a poorly designed and very leaky database abstraction which makes poor assumptions about its physical environment (you know, like one of those academic papers: “assume the network has infinite bandwidth and hard drive access is instantaneous”).

Bottom line, we paid 15,000 Euros for a 5% performance increase, how many more times do you think any given company could afford to do this? Plus, even if I wanted to, there is a physical limit on how fast a server I can get.
The only way to fix the problem is to fix the abstraction, i.e. rewrite the shitty code.

I’ll admit, I’ve never used LINQ, but I was interested in possibly adopting it in the future. From reading this, it sounds like a revamp of old-school VB6 “Open Recordset” for people who can’t quite get their heads into set-based operations instead of stepping through rows? If that’s the case, it’s not for me, but I see the use.

However:
In the original example, where did the “Top 48” come from to be introduced into the LINQ SQL statement? Did LINQ determine there were 48 records in the set to be returned? I don’t see how Jeff’s parameters would otherwise have indicated he only wanted 48. Or was this snippet pseudocode, with the limit set outside the example shown? I would find it interesting if LINQ pre-calculated its own statistics, and would have to consider that maybe Microsoft launched a second SQL optimizer team within LINQ - which I would explore.
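My guess: the original query simply included a Take(48) call, which LINQ to SQL translates directly into a TOP clause - no statistics gathering involved. A minimal sketch, assuming an illustrative DataContext named db with a Posts table:

```csharp
// Take(n) is translated by LINQ to SQL into a TOP (n) clause, e.g.
//   SELECT TOP (48) [t0].[Id], [t0].[Title], ... FROM [Posts] AS [t0]
var page = db.Posts.Take(48);
```
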

snarf! Using Joel’s argument, we may as well go back to using an abacus because all improvements afterwards “leak”.

Bah!

Writing correctly executing “slow” code is more important than writing incorrectly or poorly executing “fast” code. Write correct first, optimize where necessary later. Abstractions help us write correct code more easily.

I don’t like LINQ2SQL. If speed matters then the SqlDataReader will win every time. And why create all these new DSLs - reason: because the old stuff is so boring to write!!! Good reason ( * NOT * !!!). I think a lot of these software methodologies are crap, because one day you believe in Method A and the next day it’s Method B; you change your mind like you change underwear!
Now there’s this Entity thing coming, and NHibernate is here already, and LINQ - but when you use the SqlDataReader, life is sweet and simple! It’s fast, simple and reliable. Would you want it any other way?

So many comments by so many programmers that don’t know the first thing about Linq, much less LinqToSql.

These programmers are leakier than the abstractions they complain about.

I think LINQ sucks - there’s always something that goes wrong - so I use my SqlDataReader and push the results into a generic list. Works every time!!

@Andrew

Snarf? Isn’t that the little guy from ThunderCats?

Hi,

as some of the commenters have already pointed out, there is a serious leakage in the reasoning behind the samples in this post, or to put it more bluntly, THEY ARE WRONG. This does a tremendous disservice to the creators of LINQ and all the developers who have been using it in their work (as in, “Hey, are you using this crap that Jeff denounced recently?”).
So, I propose, Jeff, that you create a new post, apologise for your mistake, and openly say that LINQ is in fact less leaky than you claimed.

Przemek (software developer at Goyello).

Tight coupling is not the perfect answer, but it is a “simple” solution that works. I may not know a lot, but at least I’m humble, unlike those arrogant and ignorant bastards who haven’t a clue but believe they are “jedi masters”. And playing WoW doesn’t make you a computer genius, in case you thought so!!!