Reducing Your Website's Bandwidth Usage

Over the last three years, this site has become far more popular than I ever could have imagined. Not that I'm complaining, mind you. Finding an audience and opening a dialog with that audience is the whole point of writing a blog in the first place.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2007/03/reducing-your-websites-bandwidth-usage.html

In your page size comparison you’re forgetting about the effect of blog comments, which probably account for 80% of your text bandwidth on popular posts!

Not sure what the answer is - maybe page the comments rather than presenting the whole list, maybe don’t show any comments for the primary link - but it’s a big part of the problem.

And yes, HTTP compression is a big part of the solution.
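
To make that concrete, here’s a minimal sketch (Python standard library only, with a made-up page body) of the content negotiation that mod_deflate and friends perform: gzip the response only when the client advertises support via Accept-Encoding:

```python
import gzip
from wsgiref.simple_server import make_server

# Made-up page body, repetitive like real HTML.
HTML = b"<html><body>" + b"<p>Hello, bandwidth!</p>" * 500 + b"</body></html>"

def app(environ, start_response):
    body = HTML
    headers = [("Content-Type", "text/html")]
    # Only compress when the client says it can handle gzip.
    if "gzip" in environ.get("HTTP_ACCEPT_ENCODING", ""):
        body = gzip.compress(body)
        headers.append(("Content-Encoding", "gzip"))
    headers.append(("Content-Length", str(len(body))))
    start_response("200 OK", headers)
    return [body]

if __name__ == "__main__":
    make_server("localhost", 8000, app).serve_forever()
```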

  • Roddy

You might also find the book “Speed Up Your Site” handy. It’s got a lot of good advice, and there’s a companion URL that can review your site for you:

http://www.websiteoptimization.com/services/analyze/

I was taught a dozen years ago to REDUCE image size whenever possible. I see you used a PNG in your example, but I was able to take that 46k PNG and make an acceptable 18k JPG. The book that I used years ago, Designing Web Graphics by Lynda Weinman, was a big help - Lynda.com. Sadly, today, with bandwidth at home being nice and fat and print graphic designers working on the web, everyone just puts up the biggest, fattest image without thinking about reducing every image to that ‘sweet spot’ of small file size while still maintaining a good-looking image. cheers.
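For anyone hunting for that sweet spot programmatically, here’s a rough sketch using Pillow (a third-party library; the input filename is hypothetical) that saves the same image at several JPEG quality levels so you can eyeball size vs. fidelity:

```python
import io
from PIL import Image  # third-party: pip install Pillow

img = Image.open("photo.png").convert("RGB")  # hypothetical input file

for quality in (90, 75, 60, 40):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    print(f"quality={quality}: {buf.tell() / 1024:.1f} KB")
```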

Re: Lee on March 6, 2007 08:07 AM

Thanks for the answers, Lee. I can code up just about any kind of OO app conceivable, but this web thingy gets me all confused, what with the intertubes and all that stuff. My limit is hacking HTML and a bit of JavaScript.

woah, thanks for the link to websiteoptimization.

I have a ton of 17k thumbnails in my sidebar on every page that I thought were a lot smaller.

I’m doing 210kb for a page that has nothing but icons and sidebar images.

was able to take that 46k png and make an acceptable 18k jpg

If by “acceptable” you mean “full of nasty compression artifacts”. The image you’re referring to has strong, delineated areas of color, like a comic strip. Thus, PNG is a better format in this case:

http://www.codinghorror.com/blog/images/gag-fake-dog-poo.png

I’ve written all about JPG in the past, so believe me, I know the tradeoffs:

http://www.codinghorror.com/blog/archives/000464.html
http://www.codinghorror.com/blog/archives/000629.html

9 GB in a day is still not that much. If you have good hosting, like dreamhost.com, where you get 1 TB of bandwidth a month (and it grows), you’re not worried about 9 GB in a day.

Are you going to consider chopping your posts into comment and non-comment pages, like most blogs?

The most visible RSS change is that it no longer updates at precisely 11:59 every night. :wink:

Another useful content delivery network is Amazon S3. We use a combination of Limelight (expen$ive CDN) and Amazon S3 on our site.

  • Dave

Good advice!
I am using gzip to compress my HTML and JavaScript files at http://www.bizdiggers.com, and the size has been reduced by about 70%. You can try it too. I also tried to gzip the CSS file, but it doesn’t display properly on some computers (maybe a browser issue).

Interesting. When I used the compression tool you linked to against lazycoder.com/weblog, it reported around 86% compression of the returned page vs. 68% when I run it against codinghorror.com/blog. I’ve got a lot more going on on my page vs. your minimalist theme, so I’d think mine would be harder to compress. I wonder where the big differences were? I’d think images, but I’ve got a lot more images.
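If anyone wants to dig into differences like that without the online tool, here’s a quick standard-library sketch that fetches each page uncompressed and gzips it locally to estimate what HTTP compression would save (the URLs are just the two from this thread):

```python
import gzip
import urllib.request

def raw_and_gzipped(url):
    # Ask for an identity encoding; urllib does not transparently
    # decompress, so what we read is what came over the wire.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "identity"})
    raw = urllib.request.urlopen(req).read()
    # Gzip locally to estimate the potential savings.
    return len(raw), len(gzip.compress(raw))

for url in ("http://lazycoder.com/weblog", "http://www.codinghorror.com/blog"):
    raw, gz = raw_and_gzipped(url)
    print(f"{url}: {raw:,} -> {gz:,} bytes ({100 * (1 - gz / raw):.0f}% saved)")
```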

How about swooshing your website? http://www.redswoosh.net/

Thought I’d mention a bug from a while back with various versions of mod_deflate for Apache: reloading the service to update configuration changes could sporadically cause pages to hang when trying to load related CSS files. Just putting it out there in case someone happens to run into it.

Unfortunately, we are at the mercy of poorly coded aggregators.

The polling nature of RSS is unfortunately a huge bandwidth leech. One protocol that can help stem the tide (for a short while at least) is to implement “RFC3229 for feeds” as described by Bob Wyman (http://www.wyman.us/main/2004/09/using_rfc3229_w.html).

This is an HTTP delta encoding protocol, but applied to RSS and Atom feeds. I spent a lot of time implementing this (http://haacked.com/archive/2005/07/01/Potential_For_A_Subtle_Bug_in_RFC3229_Implementations.aspx) in Subtext and testing it in RSS Bandit.
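For anyone curious what that looks like, here’s a rough sketch of the server side (the data structures are hypothetical stand-ins): the client sends A-IM: feed plus the ETag from its last fetch, and the server replies with 226 IM Used and only the new items:

```python
from dataclasses import dataclass

@dataclass
class Item:
    id: int   # monotonically increasing item ID
    xml: str

# Hypothetical feed storage; the ETag tracks the newest item ID.
FEED = [Item(1, "<item>old</item>"), Item(2, "<item>new</item>")]
CURRENT_ETAG = str(FEED[-1].id)

def handle_feed_request(headers):
    """Return (status, response_headers, items) for a feed GET."""
    wants_delta = "feed" in headers.get("A-IM", "")
    try:
        last_id = int(headers.get("If-None-Match", "").strip('"'))
    except ValueError:
        last_id = None
    if wants_delta and last_id is not None:
        newer = [i for i in FEED if i.id > last_id]
        if not newer:
            # Client is fully up to date; no body at all.
            return 304, {"ETag": f'"{CURRENT_ETAG}"'}, []
        # 226 IM Used: the body contains only the delta.
        return 226, {"ETag": f'"{CURRENT_ETAG}"', "IM": "feed"}, newer
    # Client doesn't speak RFC3229: send the whole feed as usual.
    return 200, {"ETag": f'"{CURRENT_ETAG}"'}, FEED
```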

But like many of my noble but lost causes (http://haacked.com/archive/2007/03/02/A_Comparison_of_TFS_vs_Subversion_for_Open_Source_Projects.aspx), I think the adoption rate is too poor to really make a difference. At least RSS Bandit is a good citizen in this regard.

I’ll keep banging this drum, but will aggregator developers listen?

Good tips! However, I find it unprofessional to host images on third-party sites, one reason being that they can go down or, quite commonly, be slow!

You could further optimize image size by using PNGs for some images. PNG compresses simple images much better than JPEG; sometimes I see a reduction in size of more than 50%, depending on the kind of image.

Using advanced JPEG tools like GIMP or Photoshop can also help you trim precious KBs off JPEG photos.
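Here’s a rough Pillow sketch illustrating the point (the test image is synthesized, so results will vary with real images): for flat-color regions, lossless PNG often beats JPEG on size and quality, while for photos it’s usually the reverse:

```python
import io
from PIL import Image, ImageDraw  # third-party: pip install Pillow

# Synthesize a simple flat-color image, comic-strip style.
img = Image.new("RGB", (400, 300), "white")
draw = ImageDraw.Draw(img)
draw.rectangle([50, 50, 350, 250], fill="red", outline="black", width=5)

for fmt in ("PNG", "JPEG"):
    buf = io.BytesIO()
    img.save(buf, format=fmt)
    print(f"{fmt}: {buf.tell() / 1024:.1f} KB")
```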

You know how some loser websites block certain browsers (look at them with Firefox and you get “You appear to be using Netscape. Please upgrade to Internet Explorer 4!”). Is it possible to do a similar but less intrusive thing with RSS readers? That is, keep a list of known “evil” RSS readers, and serve up a little preamble at the top of each article reminding the user of their free-as-in-beer alternatives. If RSS readers send an identifier like browsers do, it should be vaguely possible…
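Aggregators do send a User-Agent header, so something like this sketch would work (the reader names and preamble wording here are entirely made up):

```python
# All reader names and wording below are made up for illustration.
BADLY_BEHAVED = ("HungryAggregator/1.0", "PollsEveryMinuteBot")

PREAMBLE_ITEM = (
    "<item><title>A note from the management</title>"
    "<description>Your feed reader polls rather aggressively; consider "
    "a better-behaved (and still free-as-in-beer) alternative."
    "</description></item>"
)

def maybe_add_preamble(user_agent: str, feed_items: str) -> str:
    # Prepend a gentle reminder when the aggregator is on the list.
    if any(bad in user_agent for bad in BADLY_BEHAVED):
        return PREAMBLE_ITEM + "\n" + feed_items
    return feed_items
```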

Does mod_deflate work under SSL (mod_ssl)? I enabled mod_deflate and mod_ssl, but the log doesn’t write any compression ratio; it just puts “-”. The HTTPS response header, however, shows “Content-Encoding: gzip”.

Great article, but you forgot to add that compressing the data on the fly will surely increase the load on your server (possibly a lot). That might not be a problem for some, but when you own your own server, load is a big problem when you’re squeezing out all the resources.
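That load is easy to measure, though. Here’s a small sketch timing zlib at different compression levels over a chunk of repetitive markup; level 1 is typically much cheaper than level 9 for only a modest loss in ratio, which is why on-the-fly compressors tend to default to the lower levels:

```python
import time
import zlib

# Repetitive, HTML-ish test data (about 1 MB).
html = b"<div class='post'><p>some repetitive markup</p></div>" * 20000

for level in (1, 6, 9):
    start = time.perf_counter()
    out = zlib.compress(html, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(out):,} bytes in {elapsed * 1000:.1f} ms")
```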