In our journey to make websites better and faster, we face many challenges.
There is no denying that performance matters, and many websites struggle to keep up with today's standards across different devices and networks.
Websites today are more than just pages. We now have web applications packed with features, and those features can introduce several types of performance issues. These issues vary in severity; at worst, they can make a website completely unusable.
A slow website is extremely unfriendly to both search engines and users.
Retaining users and customer satisfaction
A crucial goal for every type of website is retaining its users, and for that, first impressions are very important.
Sites that take more than 5 seconds to load can lose around 74% of their users; on mobile, it's about 90%.
It doesn't matter if it's a blog, a video streaming platform, or a search engine: a well-performing website that loads in under 3 seconds will always see higher traffic, and users are more likely to come back.
Impact on search engine traffic (SEO)
Search engines consider many aspects of a site, such as server response time, keywords, and bounce rate, while indexing it.
Google has indicated that page response time (time to first byte) is a big factor when it comes to ranking sites.
Google uses various algorithms to measure the value of a site.
Google's crawler spends a limited amount of time indexing a site. If a website is slow, the crawler won't be able to index all of its pages, and if a site has an exceptionally high bounce rate (users leaving within a few seconds), Google's algorithms will rank it lower.
Sales and Conversions
A slow site can have a negative impact on growth and revenue.
A Google case study showed that the probability of a bounce increased by 90% when page load time went from 1s to 5s on mobile.
Many companies have run case studies in the past to track their conversion rates and sales:
- AliExpress reduced load time by 36% and recorded an increase of about 10.5% in orders and a 27% increase in conversions for new customers.
- Amazon ran a case study and found that every 100ms of delay cost them 1% in sales.
Here are some methods to improve website performance.
Reduce the image size
Images can have a significant impact on a website.
Imagine a blog website with a featured page that displays a title and thumbnail for each post, and each thumbnail image is 1 MB in size. That's 10 MB for 10 images.
Reducing the image quality can improve page load time; thumbnail images shouldn't be more than about 20 KB.
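The payload savings from the thumbnail budget above can be sketched with some quick arithmetic (the counts and sizes are the hypothetical figures from the example, not measurements):

```python
# Rough payload math for the hypothetical blog page above.
thumbnails = 10
original_kb = 1024   # ~1 MB per unoptimized thumbnail
optimized_kb = 20    # suggested budget per thumbnail

print(f"before: {thumbnails * original_kb} KB")   # before: 10240 KB (~10 MB)
print(f"after:  {thumbnails * optimized_kb} KB")  # after:  200 KB
```

A fiftyfold reduction in page weight for the same number of images, before any other optimization.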
For icons and vector graphics, always use the SVG format, which is specifically designed for the web.
Use HTTP compression
HTTP compression compresses HTML, CSS, and JS files before they are sent from the server; the web browser decompresses them on arrival.
The main purpose is to reduce bandwidth and increase transfer speed.
There are many compression formats, but the most widely used is gzip, which is supported by almost every server and browser.
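In practice the server and browser do this transparently, but the round trip can be sketched with Python's standard-library `gzip` module (the HTML payload here is made up for illustration):

```python
import gzip

# Minimal sketch of what gzip HTTP compression does:
# the server compresses the response body, the browser decompresses it.
html = b"<!doctype html><html><body>" + b"<p>Hello, world!</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)        # what goes over the wire (Content-Encoding: gzip)
restored = gzip.decompress(compressed)  # what the browser reconstructs

print(len(html), len(compressed))       # repetitive markup compresses very well
assert restored == html
```

Markup is highly repetitive, so text assets routinely shrink by 70% or more.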
Balance the server load
A single server can only handle a limited number of requests.
A large amount of requests to a single server can cause significant response delays or no response at all.
The solution is to divide the work among multiple servers and put them behind a special kind of server known as a load balancer.
As the name suggests, a load balancer is responsible for balancing the load between servers by distributing requests. So, if one server is busy, another can handle the request.
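One common distribution strategy is round-robin: each incoming request goes to the next server in rotation. Here is a toy sketch (the server names are invented for illustration; real load balancers like Nginx or HAProxy do this, plus health checks, for you):

```python
from itertools import cycle

# Toy round-robin load balancer: send each request to the next server in turn.
servers = ["app-1", "app-2", "app-3"]  # hypothetical backend servers
rotation = cycle(servers)

def route(request_id: int) -> str:
    """Assign the request to the next server in the rotation."""
    return next(rotation)

assignments = [route(i) for i in range(6)]
print(assignments)  # ['app-1', 'app-2', 'app-3', 'app-1', 'app-2', 'app-3']
```

Round-robin is the simplest policy; others weight servers by capacity or pick the one with the fewest active connections.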
Use HTTP/2 to reduce latency
Every time you request a webpage, a new TCP connection is established with a server, which takes the request and responds accordingly.
For example, say a page consists of an HTML file, a CSS file, and a JS file, and the HTML links to 10 different image files. That's 13 round trips in total. If it takes 200ms to make a connection and get a response from the web server, it would take 2.6s to fully load the page.
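The back-of-the-envelope cost of fetching those resources one round trip at a time works out as:

```python
# Sequential fetch cost: one round trip per resource at 200 ms each.
resources = 1 + 1 + 1 + 10   # HTML + CSS + JS + 10 images = 13
round_trip_s = 0.2

sequential_s = resources * round_trip_s
print(f"{sequential_s:.1f}s")  # 2.6s
```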
That's where HTTP/2 comes in.
HTTP/2 supports multiplexing and server push, which means that instead of establishing a new TCP connection for each request, you can request multiple files over a single TCP connection at once, which considerably reduces latency and bandwidth usage.