Optimizing your website to increase performance


So you’re finishing up a website: all the content is there, pages ready to go. You think you’re ready to launch, but you decide to check your website on GTMetrix just to see how it performs. You’ve been conscious of good practices in your code, and you’ve been optimizing images along the way, so surely you’ll perform well, right?

Initial page speed analysis
Initial GTMetrix performance report

A load time of 7.5s, a page size of 2.14MB, and 98 requests: not a great showing. Guess it’s time for some optimizations!

You may be asking yourself: if the content is all there and working, why do optimizations matter so much? The primary job of a website is to generate interest and meaningful interaction from the user. If a website takes too long to load, or performs poorly, the user is more likely to navigate away from the page and forget about you forever. The average fully loaded time for a webpage hovers around 8–20 seconds; however, studies have shown that users are likely to leave a page that takes more than 3 seconds to load. Performance is part of the user experience. If you want the user to leave your website with a positive impression, then performance is something you have to consider. Below, I’d like to share some of the optimizations we made when releasing TTT Studios’ new website.

Start by installing a caching plugin

The good news is that you’re not the only one looking to optimize your website. It’s so common, in fact, that many tools exist to handle these optimizations — and some of them are free!

A great place to start, and an easy first step to improving performance, is installing a caching plugin. It allows you to do a whole host of things, including minifying resources, browser caching, database caching, and object caching. It doesn’t matter too much which one you choose, as long as it’s popular and well maintained. Some good ones are W3 Total Cache, WP Rocket, Hyper Cache, Cachify, and WP Fastest Cache. You can make these optimizations without a plugin as well, but a good plugin helps streamline the process. We chose W3 Total Cache because it does everything we need and is still supported and maintained on a regular basis. There are many recommended optimizations to be made with a caching plugin, but in the interest of time, I will touch on two main ones.

Browser caching

Very close to the top of the list of recommendations on both GTMetrix and Google PageSpeed Insights is browser caching. Browser caching saves static resources in the visitor’s local storage so that the browser doesn’t have to ask the server for those resources on subsequent visits. Setting up “expires headers” tells the visitor’s browser not to ask for a file again until its expiry deadline passes.
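Under the hood, a caching plugin typically does this by writing rules into your server configuration. As a rough sketch (assuming Apache with mod_expires enabled — the specific lifetimes here are illustrative, not prescriptive), the generated rules look something like this:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change once uploaded: cache for a year
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  # CSS and JS change more often: cache for a month
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```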

The one thing to be aware of with browser caching is making sure visitors get the most up-to-date version of your website. If you make changes to the website, the old CSS will often still be cached even though the HTML structure has changed. This can cause the new elements you’ve implemented to appear broken.

To combat this, you can add version query strings to the end of the resource URLs, so the browser treats an updated file as a new resource. This will slightly increase load time, but since we are still making active updates to the website, it is worth taking a small hit to performance to ensure your content is being displayed properly.
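In WordPress, the usual way to do this is through the version parameter when enqueueing a stylesheet or script — WordPress appends it as a `?ver=` query string. A minimal sketch (the handle name and version number here are made-up examples):

```php
<?php
function my_theme_enqueue_styles() {
    wp_enqueue_style(
        'my-theme-style',       // hypothetical handle
        get_stylesheet_uri(),
        array(),
        '1.0.3'                 // bump on each release to bust cached copies
    );
}
add_action( 'wp_enqueue_scripts', 'my_theme_enqueue_styles' );
```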

Minify all HTML/CSS/JS + Enable gzip compression

A great way to speed up loading time is reducing the file sizes of all your code files. Gzip is a compression utility that compresses your webpages, scripts, and style sheets on the server before sending them over to the browser. This drastically reduces transfer time since the files are much smaller.
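A caching plugin will usually enable this for you, but for reference, turning on gzip by hand (assuming Apache with mod_deflate) can be as small as:

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```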

Another step towards decreasing file size is minification: the process of stripping out everything the browser doesn’t need to execute the code correctly. This includes things like line breaks, comments, and long descriptive variable names. The minification is done by the WordPress server, so your source JS and CSS files remain readable to a developer making ongoing changes to the website.
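To make the idea concrete, here is a made-up CSS rule before and after minification — the output is one line with comments, whitespace, and redundant notation stripped:

```css
/* Before minification (hypothetical rule) */
.hero-banner {
  /* full-width intro section */
  margin-top: 0;
  background-color: #ffffff;
}

/* After minification */
.hero-banner{margin-top:0;background-color:#fff}
```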


Optimize your images

Although the caching plugin should give you a decent performance boost, it alone isn’t enough to take us to the finish line. For our website, the biggest improvements to performance came from image optimizations. From the start, all images used on the site were compressed using TinyPNG.com before upload. This usually cuts down the file size by 20–80%.

High vs. low definition images

Here is where you might run into a slight issue. If you use lower-resolution images for the sake of performance, they’ll show up blurry on bigger screens. If you use higher-resolution (2x) images, they look great on larger screens, but you’re sacrificing performance on smaller ones.

This is especially noticeable on mobile devices, where screen sizes hover around 400px wide. There is no point in serving a user (who may be on an expensive data plan) a 1600px-wide image when it will only display at 300px. If only there were a way to use different assets on different screen sizes.

…and of course there is.

The answer to this conundrum is something called srcset.


Srcset is an HTML5 attribute supported by most popular browsers; the few that don’t support it simply fall back to the plain `src` attribute. Together with the `sizes` attribute, srcset gives the browser a list of candidate images so it can pick the most appropriate one for the current viewport width and pixel density. There are many detailed tutorials of how to use it online (such as this one), so I won’t go too in-depth here.
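A minimal sketch of the markup (the file names here are made up — each candidate is labelled with its intrinsic width, and `sizes` hints at how wide the image will render):

```html
<img
  src="hero-800.jpg"
  srcset="hero-400.jpg 400w,
          hero-800.jpg 800w,
          hero-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 800px"
  alt="Hero image">
```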

The nice thing about WordPress is that it automatically creates resized copies of every image on upload, so this wasn’t too difficult for us to implement. For images added using the WordPress editor, the srcset is applied automatically. If you’re using ACF (Advanced Custom Fields), like we did for TTT Studios, you can use the image ID to get the attachment image srcset using WordPress’s built-in function. For a more technical explanation of how to do this, you can refer to this helpful blog post.
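A rough template sketch of that approach (assuming an ACF field named `hero_image` — a made-up name — configured to return an image ID, and the registered `large` size):

```php
<?php
// Build src, srcset, and sizes from the attachment ID returned by ACF.
$image_id = get_field( 'hero_image' );
$src      = wp_get_attachment_image_url( $image_id, 'large' );
$srcset   = wp_get_attachment_image_srcset( $image_id, 'large' );
$sizes    = wp_get_attachment_image_sizes( $image_id, 'large' );
?>
<img src="<?php echo esc_url( $src ); ?>"
     srcset="<?php echo esc_attr( $srcset ); ?>"
     sizes="<?php echo esc_attr( $sizes ); ?>"
     alt="">
```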

Lazy loading

The last image optimization we made was lazy loading the images. Lazy loading means deferring the load of an image until it is actually needed. Most of the pages on our website have this functionality built in.

Any offscreen images were set to lazy load, so that large images that aren’t needed for the initial page render don’t slow down the loading speed of the website. This helps desktop loading speed, of course, but it also really helps mobile usability by not forcing users to spend their data plan on assets they may never even see.
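One simple way to get this behaviour in modern browsers is the native `loading` attribute (the file name below is a made-up example; older browsers ignore the attribute and load the image normally, so a JavaScript-based approach is needed if you must support them):

```html
<!-- The browser defers fetching this image until it nears the viewport -->
<img src="gallery-photo.jpg" loading="lazy" alt="Gallery photo">
```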

Third party libraries

At this point, some of the only complaints GTMetrix had left were about third-party scripts and stylesheets. These can be hard to deal with, since it seems like you have no control over them. In some cases that’s true, so it is recommended that you vet all third-party libraries to make sure they are well supported and optimized before you become reliant on them. This is especially important for WordPress, where many old plugins may still have a lot of downloads but haven’t been updated for the newest versions of WordPress.

We had one particularly power-hungry widget slowing down our load times. We evaluated its use and decided it would be more beneficial for us to implement the behaviour ourselves. This of course isn’t always an option, but it is worth taking a hard look at all your plugins and scripts.

Possible improvements

With all of these optimizations, we managed to get our page to a load time of 790ms (fully loaded time of 3.3s), a size of 1.4MB, and 56 requests. Quite an improvement!

Of course this still isn’t perfect. Here are a few areas we could continue to improve:

Deferring JavaScript

Deferring JavaScript allows the page to display to the user before all the scripts have loaded. The problem for us was that most of our scripts were needed on load to handle the initial transitions on our pages. Something you’ll want to consider is the trade-off between performance and design. Both affect user experience, and it’s up to you to decide which matters more in a particular situation. In our case, the smooth transitions and image animations were more important than the improvements we would see in load time by deferring the JavaScript.
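For scripts that aren’t needed to render the page, deferring is a one-attribute change (the file name below is a hypothetical example): the script downloads in parallel but only executes after the HTML has been parsed, so it no longer blocks rendering.

```html
<script src="animations.js" defer></script>
```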

Redirect chain from third party libraries

One of our third-party scripts goes through a redirect chain to reach the most up-to-date version of the script. We could remove the redirect chain by self-hosting the file on our own server, but we would then be in charge of updating that file any time a new version is pushed out. Whether that tradeoff is worth it is up to you.

CDN (Content Delivery Network)

A CDN is a system of distributed servers that delivers content to users quickly based on their geographic location. For instance, our servers are located in the same region as our office, so anyone in our geographic area can access our site quickly, while someone across the world will see slower load times. While a CDN can speed up load times across the globe, it can be costly and introduces an additional point of failure. Since our primary target is North America at the moment, we decided it wasn’t worth it for our website. Remember, you always have the option of setting one up in the future as needed.

The work doesn’t stop once the new website is live. Many of these decisions require you to consider the goal of your website and what you need it to do. Your situation is also likely to change over time, so it’s important to reassess every once in a while and adapt to those changes. Getting a high score on GTMetrix is one thing, but ultimately you should be aiming to deliver the best experience possible for your users.