Have you ever been aggravated when a website takes an awfully long time to load? I am sure at one time or another we have all had to sit and wait for a website slower than a snail walking on the back of a turtle.
My personal experience dealing with slow websites led me to a commitment: build websites that load fast (usually under 1-2 seconds). Below are a few tips and tricks I follow to create websites that load faster.
Have you ever been to a website with tons of images and realized that the website is slow? Yes, those ones. I can't stress this enough: high-quality, unoptimized images can kill your website's performance. High quality means a large file size, which takes longer to download. Don't get me wrong, I am not saying NOT to use high-quality images. You can still use them, but if you optimize them and reduce the dimensions to what is actually needed, you will get a much smaller file size.
For example, say you want to put a profile picture of yourself on your website, and you took that picture on your iPhone at 1080x1920. If you know the picture will never be displayed wider than 360px, using it at its original dimensions is overkill.
Using my Galaxy S6, I took the following picture of a dead cactus that has been sitting on my desk for over a year; at its full resolution of 2988x5312, the file is 2.6 MB.
Resizing it (proportionally) to 320x569 brought it down to 55.1 KB. Can you see the difference? That is a decrease of almost 2.5 MB! And that is just from resizing.
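A quick back-of-the-envelope check shows why, using the dimensions and file sizes from the cactus example above:

```python
# Back-of-the-envelope: how much resizing alone shrinks the image.
# Dimensions and file sizes are the ones from the cactus example.
original_px = 2988 * 5312      # full-resolution pixel count
resized_px = 320 * 569         # resized pixel count

pixel_ratio = original_px / resized_px
print(f"Pixel count shrinks by a factor of {pixel_ratio:.0f}")  # ~87x

original_kb = 2.6 * 1024       # 2.6 MB expressed in KB
resized_kb = 55.1
size_ratio = original_kb / resized_kb
print(f"File size shrinks by a factor of {size_ratio:.0f}")     # ~48x
```

Note that the file size does not shrink in strict proportion to the pixel count (87x fewer pixels, but only about 48x fewer bytes), because JPEG compression efficiency varies with image content.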
If you optimize it using a third party tool (I use tinypng.com for quick optimizations and they result in awesome quality) then the result is even better. See the difference yourself:
As you can see from the difference above, optimizing and resizing go hand in hand. You might be wondering why you need to optimize the image at all after resizing it, since optimization only saved about 10 KB here.
Here is why: let's add some effects to the picture in Photoshop and save it as a high-quality image. When you find images online, you will most likely run into high-quality, unoptimized images like the one below (though you probably won't be looking for a dead cactus). Optimizing such an image yields a much better outcome.
See the results yourself:
Optimizing images will save you a lot of space if the original image is high quality, as you saw above. Also note that, naturally, the larger an image's dimensions, the larger its file size. Therefore, optimizing a high-quality image and reducing its dimensions to what is needed will improve performance significantly. Try it out and see for yourself.
Word of caution when using async/defer: loading scripts asynchronously via the async attribute DOES NOT guarantee that scripts will execute in the order in which they appear. Therefore, only use async when the script does not depend on any other script file being loaded. For scripts that do depend on another file (for example, jQuery plugins, which depend on jQuery), use defer instead: defer downloads the script in parallel but executes it only after the HTML document has been parsed, and deferred scripts run in the order they appear.
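As a sketch (the file names here are placeholders, not real libraries you must use):

```html
<!-- analytics.js depends on nothing else: safe to load async -->
<script async src="analytics.js"></script>

<!-- jquery.plugin.js depends on jquery.js: defer preserves their order -->
<script defer src="jquery.js"></script>
<script defer src="jquery.plugin.js"></script>
```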
Avoid loading scripts that are not used: I have lost count of the websites I have seen loading scripts that are never used on the page. This is a common leftover from a template or a plugin that is no longer in use; pay attention to those and remove them to save your website some extra bytes. Unused scripts not only take up unnecessary space but also slow your site down while the browser downloads, parses, and executes them.
I am not talking about server-side caching (server caching can be important, but it is not always available out of the box); I am mainly referring to caching in the browser.
One way to leverage browser caching is through expiry headers. Depending on how frequently you would like a resource to be updated, you set an appropriate expiry header so the browser can cache that (type of) resource for a specified amount of time before requesting it again. This ensures the browser does not re-download the resource every time a page that uses it is loaded, which is one of the reasons a page loads faster the second time you visit it.
For example, if you know for sure that images on your website will ONLY be updated once every month then setting an expiry header of 1 month for all image types (jpeg, png, etc.) in your htaccess or config file will let browsers cache all images for up to 1 month. Here is an example:
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/gif "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/x-icon "access plus 1 month"
  ExpiresByType video/x-flv "access plus 1 year"
  ExpiresByType application/pdf "access plus 1 year"
  ExpiresByType application/x-shockwave-flash "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
</IfModule>
Gzip compression is a technique your server uses to compress certain resources so that their response size is smaller than it would be otherwise. It is very effective at reducing response sizes, and you should always use this feature when available. It is as easy as specifying the following in your htaccess:
# Compress css, plaintext, xml, gif, pdf and images in transport.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/css text/plain text/xml
  AddOutputFilterByType DEFLATE application/pdf
  # Note: gif/png/jpeg are already compressed formats; gzipping them gains little.
  AddOutputFilterByType DEFLATE image/gif image/png image/jpeg
</IfModule>
Avoid query strings
When you are developing a website, query strings come in handy for busting the browser cache, which is fine, but when you go live you should avoid them. Yes, I am talking about resource links with ? in them. If you add query strings to resources you would like cached, the caching headers may not take effect: many proxy servers and caching layers will not cache resources whose URLs contain a query string, so remove them when going live.
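A common alternative (a sketch; the file names and version numbers below are made up) is to bake the version into the file name instead of a query string, so the URL changes whenever the content does:

```html
<!-- cache-busting via query string: may be skipped by some caches -->
<link rel="stylesheet" href="styles.css?v=2">

<!-- cache-busting via file name: cached like any other static file -->
<link rel="stylesheet" href="styles.v2.css">
```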
Long comments in your JS files, or even in your HTML/CSS files, will surely increase the page size. Even long variable names add a few extra bytes. You should consider minifying your CSS/JS files when going live. Here are some compressors that I use:
- YUI Compressor - http://yui.github.io/yuicompressor/
- Closure Compiler - https://developers.google.com/closure/compiler/
- HTML Compressor - https://htmlcompressor.com/compressor/
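To see what these tools do, here is a toy illustration in Python. This is a naive sketch, nowhere near what a real minifier such as YUI Compressor does, so do not use it in production:

```python
import re

def naive_css_minify(css: str) -> str:
    """Strip comments and collapse whitespace -- a toy CSS 'minifier'."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop /* comments */
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace runs
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # tighten around punctuation
    return css.strip()

css = """
/* Main heading style */
h1 {
    color : #333 ;
    margin : 0 ;
}
"""
print(naive_css_minify(css))  # h1{color:#333;margin:0;}
```

Real minifiers go much further (renaming variables, dead-code elimination, safe parsing), but even this crude whitespace-and-comment stripping shows where the savings come from.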
A CDN (Content Delivery Network) is essentially a collection of servers distributed geographically across the globe. If you put your resources (sometimes even the entire website) on a CDN, resource lookups become faster, because each request is served from the CDN node closest to your end user.
For example, if your web server is located in Boston, a visitor in Boston will get a fast response, but a user in California will see a slower one because the request has to travel from California to Boston and the response back again. A good CDN eliminates this problem by maintaining nodes all over the globe, ensuring roughly the same speed for all users (note that factors such as internal network latency, connection speed, etc. can change these results). There are several CDNs out there, such as:
- Amazon Cloud Front - https://aws.amazon.com/cloudfront/
- MaxCDN - https://www.maxcdn.com/
- CloudFlare - https://www.cloudflare.com/ (Offers a free plan)
I have compiled a list of tools below that you can use to test the speed of your website and see where the bottleneck is:
- Pingdom Speed Test - https://tools.pingdom.com/
- GTMetrix - https://gtmetrix.com/
- PageSpeed Insights - https://developers.google.com/speed/pagespeed/insights/ (also offers mobile insights)
- Think with Google - https://testmysite.thinkwithgoogle.com/