Inspired by a few articles I came across recently, namely:
- http://css-tricks.com/images-on-a-subdomain/ (Chris' move to a separate domain)
- http://developer.yahoo.com/performance/rules.html#cookie_free (rules for improving performance)
- http://sstatic.net/ (Stack Overflow's static site)
I've moved this site's static content (images, CSS and JavaScript) over to a separate, cookie-free domain. The change brings two main benefits:
1) Increasing the number of simultaneous requests
2) Reducing the size of the requests and responses to and from the server(s)
1) is achieved by serving the images, CSS and JavaScript files from a different domain than the HTML pages. So while you're busy downloading the homepage from slickhouse.com, the images are being downloaded from slickhouse.co.uk, working around the limit of two simultaneous connections per hostname recommended by the HTTP/1.1 specification.
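In practice this just means updating the asset URLs in the theme so they point at the static domain. As a rough illustration (not the actual theme code), here's what that rewrite amounts to in Python; the regular expression and the assumption that the same file paths exist on slickhouse.co.uk are mine, just for the example.

```python
import re

# Hypothetical example: rewrite relative asset references so they are
# served from the cookie-free static domain instead of the main site.
STATIC_HOST = "http://slickhouse.co.uk"  # assumed static domain
ASSET_RE = re.compile(r'(src|href)="(/[^"]+\.(?:png|gif|jpg|css|js))"')

def point_assets_at_static_domain(html: str) -> str:
    """Prefix relative image/CSS/JS paths with the static host."""
    return ASSET_RE.sub(
        lambda m: f'{m.group(1)}="{STATIC_HOST}{m.group(2)}"', html
    )

page = '<link href="/css/style.css" rel="stylesheet"><img src="/img/logo.png">'
print(point_assets_at_static_domain(page))
# Both references now point at http://slickhouse.co.uk/...
```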
2) comes from keeping slickhouse.co.uk cookie-free. Because no cookies are ever set on that domain, the browser doesn't attach any to requests for static files, which trims the request headers and shaves a little more off the page load times.
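To get a feel for what the cookie-free domain actually saves, you can compare the raw request headers with and without a typical cookie. The cookie value and the request count below are made up for illustration; the real saving depends entirely on what your site sets.

```python
# Rough illustration of the header weight cookies add to every static request.
# The cookie value below is invented; substitute whatever your site really sets.
request_without_cookie = (
    "GET /css/style.css HTTP/1.1\r\n"
    "Host: slickhouse.co.uk\r\n"
    "Accept: text/css\r\n"
    "\r\n"
)
cookie_header = (
    "Cookie: session=abc123def456; "
    "__utma=123456789.123456789.1234567890.1234567890.1234567890.1\r\n"
)
request_with_cookie = request_without_cookie[:-2] + cookie_header + "\r\n"

per_request_saving = len(request_with_cookie) - len(request_without_cookie)
static_requests_per_page = 14  # roughly the homepage count mentioned below

print(f"~{per_request_saving} bytes saved per static request")
print(f"~{per_request_saving * static_requests_per_page} bytes less upload per page view")
```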
My initial testing has shown a noticeable improvement, though I don't have any metrics to share. I used Microsoft's Fiddler tool to profile the load times and was surprised by how much third-party content the site pulls in. The Twitter feed on the right accounts for two requests on its own, and the Google Map tucked away in the site's footer added a further 20 or so. All of this gave each page a sluggish feel as it loaded.
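Fiddler does this job properly, but even a crude script over a page's HTML gives a feel for how much of it comes from other hosts. Here's a rough standard-library Python sketch; the sample markup and the widget host in it are invented, and anything injected by JavaScript (which is exactly how the Twitter and Maps widgets add their extra requests) won't show up.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class AssetCounter(HTMLParser):
    """Tally src/href references in a page's static HTML by hostname."""
    def __init__(self, own_hosts):
        super().__init__()
        self.own_hosts = own_hosts
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http"):
                host = urlparse(value).netloc
                self.counts[host] = self.counts.get(host, 0) + 1

# Invented sample markup; in practice, feed it the real homepage HTML.
sample = """
<link href="http://slickhouse.co.uk/css/style.css" rel="stylesheet">
<script src="http://widgets.example.com/twitter-feed.js"></script>
<img src="http://slickhouse.co.uk/img/logo.png">
"""

counter = AssetCounter(own_hosts={"slickhouse.com", "slickhouse.co.uk"})
counter.feed(sample)
for host, count in sorted(counter.counts.items()):
    origin = "own" if host in counter.own_hosts else "third-party"
    print(f"{host}: {count} request(s), {origin}")
```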
So I updated the theme files and removed some of the excess requests, bringing the homepage down to around 14. That's still high and could be reduced further with CSS sprites, but I'll save that for the next version.
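For the record, a CSS sprite just stitches the small images into one file and shifts each icon into view with background-position, trading many image requests for one. Below is a rough sketch of how the sheet and offsets could be generated; it assumes Pillow is installed, uses made-up icon filenames, and isn't how this site does it (yet).

```python
from PIL import Image  # assumes Pillow is installed (pip install Pillow)

def build_sprite(paths, out_path="sprite.png"):
    """Stack small images vertically into one sheet and print the CSS offsets."""
    images = [Image.open(p) for p in paths]
    sheet = Image.new(
        "RGBA",
        (max(img.width for img in images), sum(img.height for img in images)),
        (0, 0, 0, 0),
    )

    y = 0
    for path, img in zip(paths, images):
        sheet.paste(img, (0, y))
        name = path.split("/")[-1].split(".")[0]
        # Each icon becomes a background-position offset into the one sheet.
        print(f".icon-{name} {{ background: url({out_path}) 0 -{y}px no-repeat; "
              f"width: {img.width}px; height: {img.height}px; }}")
        y += img.height

    sheet.save(out_path)

# Made-up icon filenames; substitute the theme's real images.
build_sprite(["icons/rss.png", "icons/twitter.png", "icons/search.png"])
```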
To summarise, splitting your static content from the dynamic pages helps improve page load times. It also leaves room to grow: the static content could later be moved to a separate server, or even to a cloud/CDN solution.