Is your website (or web-based application) not performing the way you expect it to? Not sure why, or what to do about it? Then come learn about some key things you can do to make your website perform properly.
If there’s one near-universal constant in the modern world, it’s that businesses need websites. These sites come in many shapes and sizes.
There’s the old-style “brochure” website that just says where you can find a business, its opening hours, and the basics of its service offering. Nothing special, but at least it’s a point of presence.
Then there’s the brochure site combined with a blog, something that you can do pretty easily with tools such as WordPress, Drupal, and Joomla, among others. It’s still pretty simple, but the blog keeps it fresh and up to date.
Next up, there are ecommerce sites, whether for a very small business or for larger concerns. And finally, there are custom-designed web-based applications, the ones that let people do all manner of things.
Yet, sadly, regardless of the type of site, they seem to increasingly have one thing in common: they have lackluster performance.
In the modern world, where websites and web-based applications are effectively a must-have for any business, sole trader, government department, NGO, and so on, it’s appalling to me that they can perform so slowly.
What appalls me more is that when you look at a lot of them, as I’m increasingly spending my time doing, the reasons for the lackluster performance aren’t anything magical.
As an external consultant, I don’t often get to review the code behind the websites; most of the time I can only look from the outside in. Regardless, there are a number of things that are all too often sorely lacking.
So, in today’s post, I want to set out what I consider a basic set of principles that all websites, regardless of scale and purpose, should follow if you want them to perform properly.
Most sites these days are content-rich, relying heavily on images and videos. This can be for any number of reasons.
Below are two examples. The first is of real estate agency McGrath, where high-quality images are essential for showcasing properties to potential buyers and investors.
The second is the Australian electronics retailer, JB Hi-Fi.
However, across the growing number of sites I’ve been reviewing, a large proportion of that image and video content is not optimised. As an example, on one website I’ve been reviewing lately, I’ve been able to optimise most images by up to 35%!
To extrapolate that out: if the average image size on the site were 200KB, a 35% reduction would bring each file down to 130KB. Sadly, an average of 200KB is on the conservative side. A back-of-the-napkin calculation showed that the site’s average image size was around 548KB.
Anyway, back to the discussion at hand. “It’s only 70KB”, you may be thinking, “what’s all the concern about?” Well, how many images are being served by your site?
Say you have around 20 images; that’s 1,400KB per visitor, assuming they go through the entire site. “But Matt, the browser caches the images.” Yes, you’re right: after the initial site load it does, but not before.
As Steve Jobs knew all too well, and as the old saying goes, first impressions count! Consider how slow your site seems to new visitors on that first visit. Are they left wondering how long your site’s going to take to finish loading?
We all know the statistics: they’re not going to hang around for long at all. What’s more, speed is now one of Google’s core ranking factors.
So, now ask yourself, how many potential visitors are you losing by having them wait unnecessarily long for images and videos to load?
But that’s not the only consideration; every image that isn’t optimised costs you in other ways as well.
But that’s not all! In a world where mobile-responsive websites are ever more the norm and 3G is still the average mobile network speed, the larger your images and videos are, the poorer the experience mobile users will have with your site.
As a little exercise, have a look at your current hosting costs, and do the math to find out what you’d save if you started optimising images and videos. If nothing else, your bottom line will thank you!
There are a number of desktop applications and scripts that you can use: ImageOptim on macOS; jpegoptim and OptiPNG on Linux; and pngquant and FileOptimizer on Windows. For video optimisation, try Blazemp, FFmpeg, or HandBrake.
You can see an example of ImageOptim below, optimising two tickets I purchased recently.
Use any one of these to optimise your images and videos before they’re uploaded. Alternatively, integrate the tools into your website’s deployment pipeline, so that they’re optimised automatically.
Yes, it will take a little more time at first, but it’s worth it, and it will save you so much in the long run.
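To sketch what integrating this into a deployment pipeline might look like, here is a minimal, hedged example of a pre-deploy step. The `public` directory name and the tool flags are illustrative assumptions, and each optimiser is simply skipped if it isn’t installed:

```shell
#!/bin/sh
# Sketch of a pre-deploy image optimisation step. Adjust the directory
# and flags to suit your site; tools are skipped when not installed.
optimise_images() {
    dir="$1"
    [ -d "$dir" ] || return 0

    # Losslessly recompress JPEGs and strip metadata (EXIF, thumbnails).
    if command -v jpegoptim >/dev/null 2>&1; then
        find "$dir" -type f \( -name '*.jpg' -o -name '*.jpeg' \) \
            -exec jpegoptim --strip-all {} +
    fi

    # Recompress PNGs without changing a single pixel.
    if command -v optipng >/dev/null 2>&1; then
        find "$dir" -type f -name '*.png' -exec optipng -quiet {} +
    fi
}

# Example: optimise everything under ./public before uploading it.
optimise_images public
```

Run as part of your build or deploy script, this guarantees that nothing unoptimised ever reaches the server.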
GIF, JPEG, and PNG have been around for years now. They’re the image format workhorses of the internet. However, just because they’ve been around for years, doesn’t mean that they’re still the best choice.
By switching to a newer format such as WebP, your user still sees the excellent content on your site, but in less time. What’s more, you don’t need as many server resources, so you don’t have to pay as much in monthly or yearly hosting costs.
Now, not all browsers currently support these formats. However, an increasing number of them do; you can check current browser support for each format on caniuse.com.
But it doesn’t matter if one or more browsers don’t support the newer formats. This is because you can code your site to let the browser choose what format to serve to the visitor, based on what it supports. You can see an example of how to serve an image in WebP, falling back to PNG format below.
```html
<picture>
    <source srcset="images/website-logo.webp" type="image/webp">
    <img src="images/website-logo.png" id="website-logo" alt="Website Logo">
</picture>
```
In this example, if the user’s browser didn’t support WebP, it would be served the image in PNG format instead. Yes, it’s a little more work, but it’s worth it!
I’ll admit, this is a massive pet peeve of mine. Why, oh why, do developers still do this? I’ll be uncharitable and say that I can only put it down to laziness, or to corner-cutting under crushing deadlines, which are, sadly, all too common in modern software development.
Regardless, don’t do it! Sure, the image might look small to you on the screen. It might render at the right size for your site’s design. But that’s irrelevant.
The server’s still serving the image at its original size. And the client’s still downloading it at the original size, before resizing it on the fly (each and every time).
Don’t believe that this happens? Here’s a recent example, with the names changed to protect the innocent (or guilty, depending on how you look at it). I regularly found images that rendered at around 250 to 300px wide actually being around 4032px wide.

Doing a bit of quick math, that means the images were being rendered at roughly 7% of their original width. So an image more than 13 times wider than necessary has to be downloaded, only for over 90% of it to be, effectively, thrown away immediately.
Why?! Please. Don’t do this. Take a little bit more time to plan out the use of your image and video files, scale them accordingly, and save your users from poor performance.
To be fair, resizing every image by hand might be quite a bit of work. If so, put a script, microservice, or background job in place that creates a number of copies, each optimised for the location where it’s used.
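As a rough sketch of such a job, the function below pre-generates scaled copies of a source image at a few widths. The widths and the “-320w” naming scheme are illustrative assumptions, and it assumes ImageMagick’s `convert` command is installed:

```shell
#!/bin/sh
# Sketch of a background job that pre-scales images to the widths the
# design actually renders, so the full-size original is never served.
make_variants() {
    src="$1"
    base="${src%.*}"
    ext="${src##*.}"
    for width in 320 640 1280; do
        # Fit within a width-by-width box; ">" means never enlarge.
        convert "$src" -resize "${width}x${width}>" "${base}-${width}w.${ext}"
    done
}
```

Each scaled copy can then be offered to the browser via the `srcset` attribute, so visitors only download the smallest adequate version.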
Content compression (often referred to simply as gzip compression) is where a client indicates, via request headers, that it can accept content in a compressed form. The client lists the compression formats it supports in the Accept-Encoding request header. If the server supports one of them, it compresses the content, sends it to the client, and names the algorithm used in the Content-Encoding response header. Finally, the client decompresses the content on the fly, before rendering it to the user.

Because of the Content-Encoding header, the client knows which compression algorithm was used and can decompress the content appropriately. If that header is missing, the content was sent back uncompressed.
Yes, there are a couple of steps involved, but they’re all transparent to the end user. As an experiment, run the following from the command line to check whether your site serves gzip-compressed content:

```shell
curl -I -H "Accept-Encoding: gzip" https://<your_domain>/
```

If it does, you’ll see `content-encoding: gzip` in the output.
Here’s an example from this website:
```
HTTP/2 200
date: Fri, 27 Sep 2019 07:58:41 GMT
content-type: text/html
last-modified: Fri, 20 Sep 2019 10:51:23 GMT
vary: Accept-Encoding
etag: W/"5d84af2b-64b1"
strict-transport-security: max-age=31536000; includeSubdomains; preload
x-xss-protection: 1; mode=block
x-frame-options: SAMEORIGIN
x-content-type-options: nosniff
content-encoding: gzip
```
Alternatively, you can use the developer tools in your browser of choice, as you can see in the image below, or a tool such as [online gzip test].
There are a number of common compression algorithms in use, with the most common being gzip and deflate. I currently use gzip compression, as it’s very flexible and more reliable than deflate. Compression can be applied lightly, or you can eke out every last saving by setting the compression level to 9; just bear in mind that higher levels use more CPU on the server, often for diminishing returns in file size.
I’ve included an example of how to configure GZIP compression with NGINX for all content types, below:
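Something along these lines, inside the `http` (or a `server`) block, is a reasonable starting point; the compression level and type list here are illustrative, not prescriptive:

```nginx
# Enable gzip compression for responses.
gzip            on;
# A mid-range level; 9 squeezes hardest but costs the most CPU.
gzip_comp_level 5;
# Don't bother compressing tiny responses.
gzip_min_length 256;
# Send "Vary: Accept-Encoding" so caches store both variants.
gzip_vary       on;
# text/html is always compressed; list other types explicitly.
gzip_types      text/plain text/css application/javascript
                application/json application/xml image/svg+xml;
```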
And here’s how to configure it with Apache:
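A comparable sketch for Apache uses `mod_deflate` (the type list is, again, illustrative):

```apache
# Requires mod_deflate to be enabled (e.g. `a2enmod deflate` on Debian/Ubuntu).
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE application/javascript application/json
    AddOutputFilterByType DEFLATE application/xml image/svg+xml
</IfModule>
```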
Taking my website as an example again: the HTML content of the home page weighs in at 25.2KB. However, with compression enabled, the server sends only 7.8KB, around 31% of the original file size.
I’ve spoken a lot up till now about image size and image optimisation, but what about being able to specify whether a content item needs to be sent at all? That’s where caching comes into play.
So, in addition to optimising and compressing content, you need to make use of caching headers. That way, the client can determine if it needs to request the content at all. Because, if the server tells it that a static content item hasn’t changed since the last time a request to the site was made, then there’s no need to download it again.
An in-depth discussion of caching headers is outside the scope of this article; however, Heroku has an excellent guide if you’re keen to explore the topic in greater depth. For now, there are four headers that you need to familiarise yourself with: Cache-Control, Expires, ETag, and Last-Modified.
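As a quick, hedged illustration of the idea in NGINX (the one-month lifetime and the extension list are arbitrary examples, not recommendations):

```nginx
# Give long-lived caching headers to static assets. `expires` sets both
# an Expires header and a Cache-Control max-age; NGINX also sends ETag
# and Last-Modified for static files by default.
location ~* \.(css|js|jpe?g|png|webp|svg)$ {
    expires 30d;
}
```

With these headers in place, repeat visitors’ browsers can reuse their cached copies instead of downloading the same assets again.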
If you’re a veteran of the IT industry, this might sound wrong. After all, there’s an old saying in tech: if it ain’t broke, don’t fix it. In many instances, that’s perfectly valid. But with programming languages, it’s often wrong.
Here’s why. Often, and especially in PHP, the most recent versions of a language:
Of this list, for this post, the first two are the most important. Take PHP as an example: official PHP benchmarks show that PHP 7 can execute twice as many requests per second as PHP 5.6.
How many more visitors could your site serve, and how much lower would your hosting resource usage be, with that kind of performance improvement? It’s a double win.

Actually, it’s even better when you consider that, in most cases, virtually no code changes would be required. Yes, some older applications and sites will need to go through a careful migration.
No matter what server software you’re using, whether that’s NGINX, PHP-FPM, Apache, Tomcat, IIS, or anything else that powers your website, make sure it’s up to date. As with running the latest version of a software language, keeping your server software current offers the same set of benefits.
It’s outside the scope of this article to cover the changes in any one piece of server software, let alone all of them. So I strongly encourage you to review the changelogs of the most recent versions of the server software you use, see what’s changed, and then make the upgrade.
Yes, there will be circumstances which may limit the extent to which you can upgrade, but go as far as you can.
Those have been seven ways to speed up the performance of any website or web-based application. It doesn’t matter whether you’re running a small brochure site or a custom web-based application; each of these recommendations can help.
No matter which way you look at it, these are goals worth achieving.