There are a few interesting tools around to analyze the speed of your website. Yahoo’s YSlow and Google’s Page Speed (both Firefox plugins) are a good start and offer a lot of advice and background information. In this post you’ll see graphs from Webpagetest.
This website offers the best visual analysis IMHO and shows exactly how the page is loaded: which file was received over which connection and at what time. I’ve used it to optimize my file sync webpage and will use the steps I took as an example. Here’s my starting point:
Each bar represents a file. Time goes from left to right (less is better) and the different colors represent the different aspects of the file transfer.
Tip #1: File size doesn’t really matter
This may be hard to believe, especially if you used to optimize your site in pre-DSL times. But check the image again and look for the blue bars. Blue marks the actual data transfer time; everything else is connection overhead. Right, there’s not much blue.
When analyzing my site for the first time, I was surprised to see that the files weren’t loaded all at once, but one after another, with only a few in parallel. The graph shown above assumes that your browser uses 4 parallel connections to load data. Only when one of these 4 connections has completely loaded a file is the next file requested.
And these requests are quite slow, so keeping the number of requests to a minimum is the important factor now. The reason is that the HTTP/1.1 standard recommends using only a small number of parallel connections. In reality, browsers differ:
| Browser | Parallel requests (default) | Configurable? |
| --- | --- | --- |
| Opera 10 | 16 | Tools > Preferences > Advanced > Network |
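To get a feel for why the request count dominates, here’s a small back-of-the-envelope sketch in Python. The overhead and transfer numbers are made-up assumptions for illustration, not measurements from my site:

```python
# Rough model: each request pays a fixed connection/latency overhead
# plus a (small) transfer time. With a limited number of parallel
# connections, the remaining requests queue up behind each other.
import heapq

def total_load_time(num_files, parallel, overhead, transfer):
    connections = [0.0] * parallel          # time when each connection is free
    heapq.heapify(connections)
    for _ in range(num_files):
        start = heapq.heappop(connections)  # earliest free connection
        heapq.heappush(connections, start + overhead + transfer)
    return max(connections)

# 24 files, 4 parallel connections, 200 ms overhead, 50 ms transfer each:
print(total_load_time(24, parallel=4, overhead=0.2, transfer=0.05))
# Half the files, same total bytes per request:
print(total_load_time(12, parallel=4, overhead=0.2, transfer=0.05))
```

In this model, halving the number of files halves the total load time, even though the transfer time per file is unchanged — which matches the “file size doesn’t really matter” observation above.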
Tip #2: Reduce number of files/images
CSS sprites are another way of reducing the number of images. If you have many similar ones, like menu icons or buttons with overlays, you can put them into one single file and use CSS tricks to show only the part that you need. I’ve already been using them; otherwise the loading time would have been a lot worse.
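A minimal sketch of the sprite trick — the class names, file name, and cell size here are made up for illustration:

```css
/* All menu icons live in one combined image (icons.png), stacked
   vertically in 16x16 cells. background-position picks the cell. */
.icon        { width: 16px; height: 16px;
               background-image: url(icons.png); }
.icon-home   { background-position: 0 0;     }
.icon-mail   { background-position: 0 -16px; }
.icon-search { background-position: 0 -32px; }
```

The browser downloads icons.png once, so three icons cost one request instead of three.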
Special conditions apply
After making these changes, my website already loaded much faster. It started to render (vertical green line) after 1 second (before: 1.5s) and had the document completed (vertical blue line) after 2.8s (before: 3.4s). (The files coming after the blue line are the favicon and extra loads from a script.)
Forcing parallel downloads
The limitations for parallel requests were implemented to keep browsers from overstressing servers. But usually servers can handle a lot more. Since the limitations mentioned above are per domain, you can use extra domains (which may be hosted on the same server) to force the browser to open more connections.
I tried that by using the subdomains img1.easy2sync.com and img2.easy2sync.com. The downside is that the browser has to perform extra DNS lookups (even if it’s only a different subdomain). You can see this extra time as 2 new dark green boxes.
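In practice this just means pointing the image URLs at the alternate hostnames (the file names here are made up):

```html
<!-- Same server, two hostnames: the browser opens a separate
     pool of parallel connections for each domain. -->
<img src="http://img1.easy2sync.com/screenshot1.png" alt="Screenshot 1">
<img src="http://img2.easy2sync.com/screenshot2.png" alt="Screenshot 2">
```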
You can clearly see that now more downloads are done at the same time. The start-to-render time is almost the same and the document-complete time has decreased from 2.8s to 2.2s.
Tip #4: Use extra domains. Maybe.
Cutting connection overhead
The orange parts in the graph are interesting, too. They show the time required for the Initial Connection. You can see this orange part in every row because my server didn’t support the “Connection: Keep-Alive” feature. This feature enables the server to re-use a connection after a file has been transferred completely (instead of closing the connection and opening a new one). All current browsers support it, but maybe your server doesn’t.
Tip #5: Turn on “Connection: Keep-Alive”
My server didn’t, and it took some time until my hosting company fixed this after I inquired. You can see in the next image that most of the orange bars are gone. Since this benchmark was done much later you can’t really compare it to the previous benchmarks (some things are shown as slower now for no obvious reason), but it’s probably still safe to assume that this change improved the speed.
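If you run your own Apache server, enabling it is typically just a matter of a few directives (the values below are common defaults, not a recommendation):

```apacheconf
# httpd.conf — allow connection re-use for HTTP/1.1 clients
KeepAlive On
MaxKeepAliveRequests 100   # requests served per connection before closing
KeepAliveTimeout 5         # seconds an idle connection is kept open
```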
Tip #6: Use Expires or Cache-Control Headers
- For static content, add an “Expires” header set far in the future. A static file like http://www.site.com/images/logo.gif, which has a low probability of changing, will then “never expire”, so the browser won’t repeatedly download it each time it’s needed but will grab it from the cache instead.
- For dynamic content that can change in the future, like CSS files, you can add a “Cache-Control” header with the max-age=[seconds] option. This is similar to “Expires”, but the directive is relative to the time of the request rather than absolute. [seconds] is the number of seconds from the time of the request until the browser will reconsider refreshing the file.
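With Apache and mod_expires enabled, both rules can be expressed roughly like this (the time spans are examples, not recommendations):

```apacheconf
# .htaccess — requires mod_expires (and mod_headers for Cache-Control)
ExpiresActive On
# Static images: far-future Expires header ("never" expire)
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
# CSS may change: have the browser re-check after one day
<FilesMatch "\.css$">
    Header set Cache-Control "max-age=86400"
</FilesMatch>
```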
During this session I cut down the time till the document is complete from 3.5s to 2.2s. Making the site faster for the customers is only one aspect here. Page speed is also part of Google’s ominous “quality score”, so it might even influence your website’s position in the search results.
On the other hand, it’s never that simple. The speed differs with the location and connection of the user and the browser they use. Image 4 also shows that benchmarks may turn out differently some time later for unknown reasons. But faster is still faster, and spending some time optimizing your site might be worth it. To get started, simply visit Webpagetest and enter your page URL.
Thomas Holz is the owner of ITSTH and the author of Outlook tools to synchronize, remove duplicates, and use boilerplate texts. He writes in his devblog if he still has too much time after optimizing the website.