Ok, I am a blekko.com employee (and a former Google employee, so take my comment with a grain of salt). But one of the reasons I joined up with Blekko is that I believe Rich Skrenta’s fundamental tenet that “algorithmic” search was always a hoax. (He doesn’t state it like that, but it is what the reasoning comes out to :-).
Basically, “algorithmic” search is code for looking for “signals” (which is code for an HTML construction that indicates valuable intent) and applying those signals to a list of possible results. Back in the way back times, that was when everyone had a ‘links’ page of sites they thought were the coolest/best. Google could scrape those, infer the intent, and then rank based on linkage (the original Backrub algorithm).
The Achilles heel of algorithmic search is this: “What if you have people who lie?” Which is to say that an algorithm cannot tell, a priori, whether the web page it is scanning, which was written by a human for human consumption, was written “from the heart” (which is to say original content, original expression) or was written “from the wallet” (which is to say to specific keyword and phrase requirements). Since human labor on the Internet is cheap, an algorithm based on inferring human intent cannot discriminate between “good” humans and “bad” humans.
Blekko’s premise is that people know good content, and that a small fraction of those people are willing to take a bit of time to identify the content that is “best” for a given category. Blekko enables that understanding to be codified into slash tags, which are a community resource. Thus a small fraction of people with good taste can create a much better search experience for everyone.
Of course, there remains the question of why evil humans can’t do the same thing to Blekko by creating their own slash tags filled primarily with their ad-revenue-generating content. The answer is that while such slash tags can be created, you as the user decide which (if any) slash tags you want to use to filter your results. If you try a user’s slashtag and find it is full of spammy links, you don’t have to use it; what is better, you can use it as an “anti-filter,” which is to say, exclude any sites that spammer has in their slashtag from the returned results.
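To make the filter/anti-filter idea concrete, here is a minimal sketch (my own hypothetical data model, not Blekko’s actual implementation): treat a slashtag as a curated set of hosts, then either keep only results from those hosts, or invert it to exclude everything a spammer curated.

```python
def apply_slashtag(results, slashtag_hosts, exclude=False):
    """Keep results whose host is in the slashtag, or drop them if exclude=True
    (the "anti-filter" case)."""
    if exclude:
        return [r for r in results if r["host"] not in slashtag_hosts]
    return [r for r in results if r["host"] in slashtag_hosts]

# Illustrative data only -- these hosts and URLs are made up.
results = [
    {"host": "goodrecipes.example", "url": "https://goodrecipes.example/pie"},
    {"host": "spamfarm.example", "url": "https://spamfarm.example/buy-now"},
]
spammy_tag = {"spamfarm.example"}

# Anti-filter: strike every site the spammer bothered to curate.
clean = apply_slashtag(results, spammy_tag, exclude=True)
# clean now contains only the goodrecipes.example result
```

The nice property is that the spammer’s own curation effort becomes a signal you can use against them.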
Unlike the curated directory which Yahoo! pioneered in the ’90s, Blekko crawls the web like an algorithmic search engine, and then sieves the results through what is a community-constructed filter of quality. The goal is a scalable, robust search engine with consistently high-quality results. Content farms and content duplicators have to fool a human to get into a slashtag, which thankfully continues to be an unsolved problem.
Great discussion by the way on this.
–Chuck