Over the last three years, this site has become far more popular than I ever could have imagined. Not that I'm complaining, mind you. Finding an audience and opening a dialog with that audience is the whole point of writing a blog in the first place.
In your page size comparison you’re forgetting about the effect of blog comments, which probably account for 80% of your text bandwidth on popular posts!
Not sure what the answer is - maybe page the comments rather than presenting the whole list, maybe don’t show any comments for the primary link - but it’s a big part of the problem.
And yes, HTTP compression is a big part of the solution.
You might also find the book "Speed Up Your Site" handy. It's got a lot of good advice, as well as a complementary URL that can review your site for you:
I was taught a dozen years ago to REDUCE image size whenever possible. I see you used a PNG in your example, but I was able to take that 46k PNG and make an acceptable 18k JPG. The book that I used years ago, Designing Web Graphics by Lynda Weinman, was a big help - Lynda.com. Sadly, today, with bandwidth at home being nice and fat and print graphic designers working on the web, everyone just puts up the biggest, fattest image without thinking about reducing every image to that 'sweet spot' of small file size while still maintaining a good-looking image. Cheers.
Thanks for the answers, Lee. I can code up just about any kind of OO app conceivable, but this web thingy gets me all confused, what with the intertubes and all that stuff. My limit is hacking HTML and a bit of JavaScript.
was able to take that 46k png and make an acceptable 18k jpg
If by “acceptable” you mean “full of nasty compression artifacts”. The image you’re referring to has strong, delineated areas of color, like a comic strip. Thus, PNG is the better format in this case:
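A rough way to see why, sketched in Python with the standard zlib module (the same DEFLATE compression PNG uses internally); the pixel values below are made up for illustration:

```python
import os
import zlib

# Why PNG suits flat-color images like that screenshot: PNG's DEFLATE
# (zlib) stage collapses long runs of identical pixels to almost nothing,
# while JPEG's lossy transform smears hard color edges into artifacts.
flat = bytes([200, 30, 30]) * 400   # 400 identical "red" pixels (1200 bytes)
noisy = os.urandom(1200)            # photo-like, unpredictable data

print(len(zlib.compress(flat)))    # a couple dozen bytes
print(len(zlib.compress(noisy)))   # barely smaller than the 1200-byte input
```

The same reasoning in reverse explains why JPEG wins on photographs: there are no long runs of identical pixels for DEFLATE to exploit.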
9 GB in a day is still not that much. If you have good hosting, like dreamhost.com, where you get 1 TB of bandwidth a month (and it grows), you are not worried about 9 GB in a day.
Good advice!
I am using gzip to compress my HTML and JavaScript files at http://www.bizdiggers.com, and the size has been reduced by about 70%. You can also try it. I also tried to gzip the CSS file, but it doesn’t display properly on some computers (maybe a browser issue).
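For anyone trying this on Apache, a minimal mod_deflate setup looks roughly like the following; the BrowserMatch lines are the commonly cited workaround for old browsers that mishandle gzipped CSS and JavaScript, which may explain the display problem on some machines:

```apache
# Compress the text-based content types
AddOutputFilterByType DEFLATE text/html text/css application/x-javascript

# Workarounds for browsers with broken gzip support
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
```

The exact directives depend on your Apache version, so treat this as a starting point rather than a drop-in config.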
Interesting. When I used the compression tool you linked to against lazycoder.com/weblog, it reported around 86% compression of the returned page vs. 68% when I ran it against codinghorror.com/blog. I’ve got a lot more going on on my page vs. your minimalist theme. I’d think mine would be harder to compress. I wonder where the big differences were? I’d guess images, but I’ve got a lot more images.
Thought I’d mention a bug from a while back with various versions of mod_deflate for Apache. Reloading the service to apply configuration changes could sporadically cause pages to hang when trying to load related CSS files. Just putting it out there in case someone happens to run into it.
Unfortunately, we are at the mercy of poorly coded aggregators.
The polling nature of RSS is unfortunately a huge bandwidth leech. One protocol that can help stem the tide (for a short while at least) is to implement “RFC3229 for feeds” as described by Bob Wyman (http://www.wyman.us/main/2004/09/using_rfc3229_w.html).
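As I understand Bob Wyman’s proposal, the client advertises delta support and the server replies with only the new entries. A typical exchange (headers abbreviated; the ETag values here are made up) looks something like:

```
GET /feed.xml HTTP/1.1
Host: example.com
A-IM: feed
If-None-Match: "20040910-entries"

HTTP/1.1 226 IM Used
IM: feed
ETag: "20040912-entries"

(body contains only the entries added since the old ETag)
```

Clients that don’t send `A-IM: feed` just get the full feed as usual, so the scheme degrades gracefully.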
You could further optimize image size by using pngs for some images. PNG compresses simple images much better than jpeg, sometimes I see a reduction in size of more than 50% depending on the kind of image.
Using advanced jpeg tools, like gimp or photoshop can also help you trim precious kb’s off jpeg photos as well.
You know how some loser-websites block certain browsers (look at them with Firefox and you get “You appear to be using Netscape. Please upgrade to Internet Explorer 4!”). Is it possible to do a similar but less intrusive thing with RSS readers? That is, keep a list of known “evil” RSS readers, and serve up a little preamble at the top of each article reminding the user of their free-as-in-beer alternatives. If RSS readers send an identifier like browsers do, it should be vaguely possible…
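Seems doable in principle, since most readers do send a User-Agent header. A hypothetical sketch in Python (the reader names and preamble wording below are invented for illustration):

```python
# Hypothetical sketch: flag known-wasteful feed readers by their
# User-Agent string and prepend a gentle preamble to each feed item.
# The reader names here are made up, not real aggregators.
WASTEFUL_READERS = ("GreedyAggregator", "PollMonster")

def annotate_feed(user_agent, items):
    """Return feed items, with a reminder prepended for flagged readers."""
    if any(name in user_agent for name in WASTEFUL_READERS):
        note = "[Your feed reader polls this site inefficiently; consider switching.] "
        return [note + item for item in items]
    return items

print(annotate_feed("PollMonster/2.0", ["First post"]))
print(annotate_feed("FriendlyReader/1.0", ["First post"]))
```

In practice you’d match on the real User-Agent strings the offending aggregators send, which you can harvest from your access logs.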
Does mod_deflate work under SSL (mod_ssl)? I enabled mod_deflate and mod_ssl, but the log does not record any compression ratio; it just puts “-”. The HTTPS response header, however, shows “Content-Encoding: gzip”.
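In case it helps: the ratio only shows up in the access log if you register the deflate filter notes and log them explicitly, per Apache’s mod_deflate documentation; something along these lines (the log file path is just an example):

```apache
# Record compression statistics as request notes
DeflateFilterNote Input instream
DeflateFilterNote Output outstream
DeflateFilterNote Ratio ratio

# Log them: output bytes / input bytes (ratio%)
LogFormat '"%r" %{outstream}n/%{instream}n (%{ratio}n%%)' deflate
CustomLog logs/deflate_log deflate
```

If the notes aren’t set for a request (e.g. the filter never ran), the log shows “-” for those fields, which sounds like what you’re seeing.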
Great article, but you forgot to add that compressing the data on the fly will surely increase the load on your server (possibly a lot). Might not be a problem for some, but when you own your own server, load is a big concern when squeezing out all the resources.