Web-tech nerd stuff

How to reduce web page download time in 4 (fairly) simple steps

Web pages have generally become much bulkier in recent years. The trend towards JavaScript-based functionality enhancements and the popularisation of digital photography have led to more data being transferred (downloaded) per page, and I can’t see that reversing any time soon.

If you’re using a mobile device such as an iPhone, iPad or Android handset – or indeed any other device – on anything other than a very consistently fast web connection, you’ll no doubt have experienced the frustration of slow page loads. So what can web developers do to improve the situation? We’re extremely unlikely to convince web designers to use less graphical content in their layouts, and customers certainly won’t like being advised not to upload imagery along with their textual content… so we need to make improvements at the source.

The 2 major factors in web page download speed (aside from the data connection itself which, beyond the data centre, we have essentially no control over) are:

  1. The amount of data which has to be downloaded in order to display the web page – this comprises source code (e.g. HTML, JavaScript, CSS) and media items (e.g. images, videos, sound files)
  2. The number of HTTP requests made – each item (e.g. an image, a JavaScript or CSS file) which is downloaded requires an HTTP request, and each HTTP request carries latency – a delay which further slows down the page load

There are 4 fairly simple improvements which can almost always be made to a website and which will provide some pretty big gains alongside the traditional advice of “Export your JPEGs at the lowest acceptable quality setting”:

  1. Minification – combining same-type source files (e.g. JavaScript, CSS) into a single file (to reduce the number of HTTP requests) and minifying that single file (removing unnecessary characters such as whitespace and comments to reduce the downloaded file size)
  2. CSS Sprites – combining all possible CSS-specified images into a single image file (to reduce the number of HTTP requests), with each element then displaying only its own portion of that image
  3. Serving page assets (e.g. JavaScript, CSS, images) from multiple (sub)domains. Web browsers have a built-in (or default) maximum number of concurrent connections which they will make, and usually this limit applies on a per-(sub)domain basis. The theory is that more concurrent connections will result in a shorter web page download time. The default maximum number of concurrent connections for each of the major web browsers is easy to look up online.
    Of course, your (sub)domains can be synonyms (e.g. CNAMEs) for each other – as far as the browser is concerned, that’s still a different (sub)domain.
    You should be aware that using more concurrent connections is very likely to increase your web server load so if your server margins are tight, proceed with caution!
  4. Web server output compression (HTTP compression) – virtually every vaguely modern web server and web browser supports compression/decompression of web page source code, usually via the gzip and/or deflate methods – this works in much the same way as “zipping” a file on your computer. Enabling output compression is usually as simple as changing a setting in your web server configuration file – I’d suggest searching the web for a tutorial if you don’t know how to do this.
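For step 2, the trick is that the one sprite image is downloaded once, and each element then shows only its own slice of it via background-position. A minimal sketch (icons.png and the class names are hypothetical, assuming 32×32px icons stacked vertically):

```css
/* One sprite image serves several UI icons – one HTTP request, not three */
.icon        { background-image: url(icons.png); width: 32px; height: 32px; }
.icon-home   { background-position: 0 0; }
.icon-search { background-position: 0 -32px; }
.icon-mail   { background-position: 0 -64px; }
```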
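To make step 1 concrete, here’s a deliberately crude shell sketch – the file names are hypothetical, and on a real site you’d use a proper minifier (e.g. YUI Compressor) rather than sed, but the principle of “fewer files, fewer bytes” is the same:

```shell
# Hypothetical stylesheets standing in for a real site's CSS
printf 'body { margin: 0; }\n\n' > reset.css
printf 'h1 { color: #333; }\n' > typography.css

# Combine same-type files: one HTTP request instead of two
cat reset.css typography.css > combined.css

# Crude "minification": strip leading whitespace and blank lines
sed -e 's/^[[:space:]]*//' -e '/^$/d' combined.css > combined.min.css
```

The combined, minified file is what you’d actually reference from your pages.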

On top of this, you should think about your markup, front-end scripts and stylesheets – keep them as minimal as you can whilst still supporting all those nasty legacy web browsers (IE6 – that means you!).
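Coming back to step 4: as one common example (assuming Apache 2.x with mod_deflate enabled – other servers have their own equivalents, such as Nginx’s gzip directives), output compression can be switched on with a single configuration line:

```apache
# Compress text-based responses before sending them to the browser
AddOutputFilterByType DEFLATE text/html text/css application/javascript
```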

So there we have it – 4 pretty simple steps to help combat the 2 biggest demons of web page download speed: data size and the number of HTTP requests.

I hope that helps…


Created: Fri, 01 Oct 2010 09:00:00 GMT
Last modified: Fri, 01 Oct 2010 09:00:00 GMT