YSlow: Yahoo's Problems Are Not Your Problems

Last-modified:
A weak validator? WTF. Only if you use it as one. You can put any date you like in there. I think that page you link to is talking nonsense. Most sites would change the last-modified date whenever anything on the page changed.
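
For anyone unfamiliar with how the Last-Modified validator actually behaves on the wire, the exchange looks roughly like this (an illustrative request/response pair; the date and path are made up):

```
HTTP/1.1 200 OK
Last-Modified: Tue, 14 Aug 2007 09:00:00 GMT

# later, the browser revalidates its cached copy:
GET /article.html HTTP/1.1
If-Modified-Since: Tue, 14 Aug 2007 09:00:00 GMT

# unchanged page: no body is re-sent
HTTP/1.1 304 Not Modified
```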

Stoyan, the ETag problem Yahoo is talking about is not for small servers; it is for web farms, where the server you make one request to isn't necessarily the same server you hit on the second request.

Also, it is generally not a good idea to turn off ETags if your content is only hosted on one server, like a blog. However, if you are running your blog from an IIS server, you should be aware of this ETag bug in IIS:

http://dotnetjunkies.com/WebLog/leon/archive/2005/02/16/54630.aspx

The quick way of getting rid of the ETag in IIS is to go to the HTTP Headers tab in the properties of your website and add a new HTTP header with Name = "ETag" and Value = "" (empty). This removes the ETag from the response headers.
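
(If you are on IIS 7 or later, the same empty-ETag header can be added in web.config instead of through the GUI. This is just a sketch of that idea; whether an empty custom header fully suppresses the server-generated ETag differs between IIS versions, so check the response headers after deploying.)

```xml
<!-- web.config sketch: mirror the GUI steps above by adding an empty ETag header -->
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="ETag" value="" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
```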

Nick

Of course it's a vast oversimplification to simply give a score for a website, but presumably the people who use YSlow have enough common sense to realise this.

I actually tend to use a similar tool at http://linuxbox.co.uk/website_performance_test.php as well as YSlow - both give slightly different advice.

I always wondered: *Why are they outright telling me that I must turn off ETags on account of server-farm problems which I simply don't have?*

I knew it was bad… But I turned them off anyway on a particular site, as at the time we were kind of competing to see who could make the most dramatic score improvement on any existing site (with YSlow).

The article, though, is awesome.

Thanks for this great article. I'm in the middle of shrinking the footprint of uxbooth.com. Like many, I'll be following along with the advice from Yahoo's blog. I like that you analyze each point they make and explain them from a practical point of view. Indeed, we don't all have top-tier sites.

I will be linking to this article from my blog post. You just got a new reader :)

Thanks,
Andrew

Great read. Since I also implemented Yahoo's 13 rules, I am beginning to wonder if I am somehow hurting my site.

I have increased speed dramatically (which is good), but have lost a significant amount of traffic ever since. I don't understand it and cannot pinpoint the problem.

Anyone care to take a look at what could be the culprit? Head over to http://www.geekberry.net/ - it's a wireless technology site. At one point I was doing 360,000+ visits a month. Now I hardly do half that! :(

@Giancarlo: is that supposed to be a troll? Not a very creative way to get readers for your site :) Though granted, it does at least appear you actually read the post.

Now, the culprit could of course be that you are measuring 'visits' wrongly. Oftentimes 'hits' are confused with 'visits', and some log analyzers will simply fail to spot that multiple requests form a single visit.

So reducing requests (HTML, JS, CSS, whatnot) will - ideally - vastly reduce the number of hits (a good thing, as in avoiding taking a hit). That doesn't mean you get fewer visitors. I'd check your log numbers to see whether you might actually have gotten more visitors and are simply serving them in fewer request/response cycles.

I wish more people would evaluate advice from authorities such as Yahoo and Google. My boss decided to implement every one of those rules, but in the case of the cache header he set a blanket +10 years expiration date for the WHOLE site - HTML and all - even though the site gets updated at least once a week, if not more often. It has since been corrected, but unfortunately the .htaccess file was up for a whole month. And come to find out, not everyone clears their cache on a regular basis. So here we are, months later, still walking people through how to clear their cache so they don't see a site in a time warp.
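
For anyone cleaning up after the same mistake: far-future expiration belongs on the static assets, not on the HTML that references them. A rough .htaccess sketch using Apache's mod_expires - the directives are standard, but the types and lifetimes are assumptions to tune for your own site:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # long-lived static assets (rename/version them when they change)
  ExpiresByType text/css               "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType image/png              "access plus 1 year"
  # never far-future the pages themselves
  ExpiresByType text/html              "access plus 0 seconds"
</IfModule>
```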

Wow, dude, you suck. Every little byte and HTTP request saved is important regardless of how heavy your site traffic is. A grade of 70-something on YSlow is very bad even for the "small" sites you use as an example; you can get at least a grade of 90 even with an "E" for CDN. You should go back to school to learn this stuff or get a new career.

For more info on decreasing latency, Aaron Hopkins from die.net has an interesting article up showing the effect that keep-alives and multiple hostnames (among other things) have on page load time:

http://www.die.net/musings/page_load_time/
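
If you just want the keep-alive part of that article in config form, the relevant Apache directives look roughly like this (the numbers are illustrative, not recommendations taken from the article):

```apache
# reuse TCP connections across requests instead of paying
# connection setup for every object on the page
KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5
```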


The distinction between weak and strong validators is not relevant to caching of entire files. Where it is relevant is in resuming broken downloads of very large files (notice how browsers will do this if the connection breaks after a few megs of a many-meg file). In this case it's important to know that the file you are resuming is byte-for-byte identical to the file you started downloading.

Weak E-Tags (with W/ before the quotes) and Last-Modified don't make that promise to the client; strong E-Tags do.
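
Roughly what that looks like on the wire when a download is resumed (illustrative headers; the sizes, path and tag values are made up):

```
HTTP/1.1 200 OK
ETag: "5e3c1a9f"                 # strong: byte-for-byte identity
# a weak tag would be: ETag: W/"5e3c1a9f"  (semantic equivalence only)
Content-Length: 10485760

# resuming after the connection broke at 3 MB:
GET /big-file.iso HTTP/1.1
Range: bytes=3145728-
If-Range: "5e3c1a9f"             # only honour the range if it's still the same bytes

HTTP/1.1 206 Partial Content
Content-Range: bytes 3145728-10485759/10485760
```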

E-Tags for web farms can be implemented by generating the e-tag yourself: if each server (or each server process in a farm of web gardens) produces the same e-tag for the same entity, this works perfectly.
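
A minimal sketch of that idea in Node (core modules only), where the ETag is derived from a hash of the entity body so every server in the farm emits the same tag for the same content; the file name and port are made up for illustration:

```javascript
const http = require('http');
const crypto = require('crypto');
const fs = require('fs');

http.createServer((req, res) => {
  const body = fs.readFileSync('./logo.png'); // hypothetical static asset
  // same bytes -> same hash -> same ETag on every machine in the farm
  const etag = '"' + crypto.createHash('md5').update(body).digest('hex') + '"';

  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304); // client's cached copy is still valid
    return res.end();
  }
  res.writeHead(200, { 'ETag': etag, 'Content-Type': 'image/png' });
  res.end(body);
}).listen(8080);
```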

The cache blanket of 10+ years should be fine for supporting resources such as JS or CSS, so long as you update them (under new file names) whenever you update your main page.

Your main page does not need a long expiration to pass YSlow or PageSpeed. Newer build tools such as webpack can handle this for you quite easily by creating the bundle with a different name whenever its content changes, while still building the HTML file for you via html-webpack-plugin.
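
A sketch of that webpack setup (assuming webpack 5 and html-webpack-plugin; the paths and options are illustrative, not taken from the repository linked below):

```javascript
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  output: {
    filename: '[name].[contenthash].js', // new file name whenever the content changes
    clean: true,                         // drop stale bundles from the output dir
  },
  plugins: [
    // regenerates index.html with <script> tags pointing at the hashed bundles
    new HtmlWebpackPlugin({ template: './src/index.html' }),
  ],
};
```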

This is the approach I took with my portfolio site; the sources are available at https://github.com/trajano/trajano-portfolio.