URL Shorteners: Destroying the Web Since 2002

Is anyone else as sick as I am of all the mainstream news coverage on Twitter? Don't get me wrong, I'm a Twitter fan, and I've been a user since 2006. To me, it's a form of public instant messaging -- yet another way to maximize the value of my keystrokes. Still, I'm a little perplexed as to the media's near-obsession with the service. If a day goes by now without the New York Times or CNN mentioning Twitter in some way, I become concerned. Am I really getting all the news? Or just the stupid, too long, non-140-character version of the news?


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2009/06/url-shorteners-destroying-the-web-since-2002.html

I have hated URL shorteners from day one. I never click on a single one, no matter the circumstances. Also, I HATE Twitter.

Doesn’t anyone else think all the twittermania is a carefully engineered marketing plan? I mean, come on, what does Twitter do for a celebrity that a regular blog doesn’t? I’m sure there’s some money changing hands here.

The problem isn’t with Twitter or the URL shortening services. The problem is that you’re using the wrong tool for the job. Twitter is generally not a good platform for…

“brief summary or opinion” [link for more detail]

Less than 4% of my tweets have required the use of tinyurl, because I don’t try to use Twitter for something Twitter isn’t good for. Few of my tweets include links at all, and the vast majority of those are twitpic.

Twitter could certainly become a better platform for such things, but—so far—they don’t want to. It seems a bit counter to their intentions to me. Try adjusting your use to better match what it is (as the whole 140 character limit suggests), and see if you don’t enjoy it more.

Generally, I do think URL shortening services have a lot of problems, which is why I avoid using them except in the rarest circumstances. Again—using it in a way that works makes me happier than complaining that abusing it doesn’t work well.

Hashing URLs for shortening doesn’t really make sense, because what we want is a function that takes a small input and produces a larger one (a short code back to the original URL), not a function that takes a large input and produces a smaller one. A hash only goes the wrong way and can’t be reversed, so the service needs a lookup table either way.
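For what it’s worth, here is a minimal sketch of the lookup-table approach shorteners actually rely on: a sequential ID encoded in base 62, with the long URL stored alongside it. The function names and the in-memory store are my own illustration, not any particular service’s implementation.

```python
# Minimal counter-based shortener sketch: a short code is just the base-62
# encoding of a sequential ID, and expansion is a plain lookup, not a hash.
ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

_store = {}    # id -> long URL (a real service would use a database)
_next_id = 1

def shorten(long_url):
    """Assign the next ID to the URL and return its base-62 code."""
    global _next_id
    url_id, _next_id = _next_id, _next_id + 1
    _store[url_id] = long_url
    code = ""
    while url_id:
        url_id, rem = divmod(url_id, 62)
        code = ALPHABET[rem] + code
    return code

def expand(code):
    """Reverse the base-62 encoding and look the original URL back up."""
    url_id = 0
    for ch in code:
        url_id = url_id * 62 + ALPHABET.index(ch)
    return _store[url_id]

print(expand(shorten("http://www.codinghorror.com/blog/")))
```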

Huh? WTF are you talking about? Is it just me, or are you talking about something that is completely meaningless and has absolutely no impact whatsoever? What's Twitter? What's a tiny URL? Why do we need tiny URLs? I use the internet a lot, every day, a lot, and I have NO idea what you're blathering on about. Sorry, I tried.

Personally, I find shrinkster and tinyurl way too long. Is there anything shorter than tr.im?

I can’t wait until the uselessness that is Twitter goes the way of Geocities.

Why not use the approach you suggested just a few days ago? Ignore twitter. Ignore URL shorteners. You only give them power by using them.

I don’t twitter, I don’t use URL shorteners … I have never felt better in my life! :slight_smile:

Great post … good discussion … however, has the idea of “web standards” been forgotten? I think the biggest issue raised here was the subversive effect URL shorteners have on the web. However, as some have said, they are the public’s response to a real problem. As usual, the standards organisations and the big men in town are way behind the pack.

It’s just a gold rush and the shortener services are mining the free claims while they can. No need to panic, but definitely interesting sport.

First!

I think there should be some web service that makes it easy to add real semantics to the web.

Have you been following any of the rev=canonical discussion? See, for example, http://simonwillison.net/2009/Apr/11/revcanonical/ or http://shiflett.org/blog/2009/apr/save-the-internet-with-rev-canonical.

Briefly, the idea is that sites should control their own URL shortening, thus side-stepping some of the problems that you describe.
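For illustration only, here is a rough sketch of the client side of that idea, assuming a page that advertises its short form with a rev=canonical link element (the page URL below is made up):

```python
# Sketch: fetch a page and look for <link rev="canonical" href="...">,
# the site's self-declared short URL under the rev=canonical proposal.
from html.parser import HTMLParser
from urllib.request import urlopen

class RevCanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.short_url = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rev") == "canonical":
            self.short_url = attrs.get("href")

def find_short_url(page_url):
    """Return the page's self-declared short URL, or None if it has none."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    finder = RevCanonicalFinder()
    finder.feed(html)
    return finder.short_url

# Hypothetical page; any site that emits rev=canonical would work the same way.
print(find_short_url("http://example.com/some/long/article-title"))
```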

Are there security implications to the growing popularity of URL shortening services? E.g. users become familiar with tinyurl.com etc. and start to trust it as a host when in fact a cracker could be using the service to redirect to their malicious website. And of course there’s no way of seeing where you’re going to end up without navigating the link (is there?)…
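There is a partial answer to that last question: these services redirect with an HTTP status and a Location header, so you can ask where a link goes without actually following it. A rough sketch (the short link below is hypothetical, not a real redirect):

```python
# Sketch: resolve a short URL's destination without visiting it, by issuing
# a HEAD request and reading the Location header instead of following it.
from http.client import HTTPConnection, HTTPSConnection
from urllib.parse import urlparse

def peek_destination(short_url):
    """Return the redirect target of a short URL, or None if there is none."""
    parts = urlparse(short_url)
    conn_cls = HTTPSConnection if parts.scheme == "https" else HTTPConnection
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    response = conn.getresponse()
    location = response.getheader("Location")
    conn.close()
    return location

# Hypothetical short link; shortening services generally answer 301/302 here.
print(peek_destination("http://tinyurl.com/example"))
```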

Surely the more worrying problem is that Twitter becomes useless to researchers in the future when all the URL services are dead and no one knows who you linked to? Twitter should start storing the full URL or provide its own service. And the URL shorteners should be open about their databases too, to put them in escrow.

What “sane” way to link a word would be more character efficient than what we have in Twitter now? All the code you introduce with HTML or phpBB or Markdown adds a lot of extra characters. That 140-character limit isn’t a display limit, but a data transmission limit. (Likely based on the 160-character data+headers limit of SMS.)

As for your suggestion of search engines generating “short hashes” for every URL, you yourself answered that in your previous entry – the hashes would be too long! Not every URL will ever need a short version, so the current iterative schemes, combined with an ever-changing variety of domains providing the service, is much more space-efficient.
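To put rough numbers on that, here is a quick sketch, assuming MD5 as the stand-in hash and a base-62 counter as the iterative scheme:

```python
# Sketch: compare the length of a hash-derived code with a counter-derived one.
import hashlib

url = "http://www.codinghorror.com/blog/2009/06/url-shorteners-destroying-the-web-since-2002.html"

# A full MD5 digest is 32 hex characters, far too long for a "short" URL,
# and truncating it reintroduces collisions with no way to resolve them.
print(len(hashlib.md5(url.encode()).hexdigest()))  # 32

# A sequential base-62 counter covers ~56.8 billion URLs in only 6 characters.
print(62 ** 6)  # 56800235584
```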

Stopped reading after the first 140 characters. Was it good? :stuck_out_tongue:

Most of us never get there. I don’t think I will.

Aww, Jeffy, SO’s awesome. It’s just a far narrower subset of web users using it.

There are some Firefox extensions that deal with the problem of shortened URLs. The one that I currently use is NoRedirect, which lets me force a “preview” layer on all of the shortened URL services.

(https://addons.mozilla.org/addon/11787)

One weird thing I’ve noticed Twitter do: if I put a longish URL in my tweet, even if the whole tweet is less than 140 characters, it still converts it to a bit.ly URL.

What’s up with that?