14 Rules To Improve Your Website Performance

In this article we discuss the major aspects of website performance. These topics are largely based on the work of Steve Souders [1], a web performance expert with an enviable résumé that includes stints at Yahoo! and Google. You can find more about his rules at http://stevesouders.com/hpws. By the end of the article you should understand the difference between a beautiful website that is irritatingly slow and unresponsive, and a beautiful website that is fast and scalable.

Back-end optimization remains critical for driving down hardware costs and reducing power consumption, but when a web page needs a better response time, the place to start is the front-end. On a typical page, fetching the HTML document accounts for only about 10-20% of the end-user response time; the remaining 80-90% is spent downloading and rendering the components the HTML references. Looking at this reality, Steve Souders developed a set of 14 rules. Applied to a standard website, these rules can cut loading times by roughly 25-50%.


Figure 1 presents the loading time of each component of a well-known Portuguese news page, www.rtp.pt.
 

[Figure 1: loading times of the components of www.rtp.pt]


As the figure shows, fetching the HTML accounts for only about 10% of the total loading time. All the remaining time is spent downloading the other components, rendering the content and running scripts.

 

Speeding Up

Without taking the right precautions when building your website, you could end up providing a sluggish service.

Time is money: a slow service is not only a hassle for repeat visitors, it will also cost you subscribers, customers and, consequently, revenue.

When a person lands on your site for the first time, you only have a few seconds to capture their attention and convince them to hang around. If your site takes too long to load, most people will be gone before you ever get a chance.

"A one-second delay could result in 7 percent fewer conversions, 11 percent fewer page views, or even a 16 percent decrease in customer satisfaction" [2].

To improve website performance, the following 14 rules are commonly applied.

 

Rule 1 - Make Fewer HTTP Requests

This rule is simple and straightforward: make fewer HTTP requests. You don't want to compromise your design, but you also shouldn't lose performance to a wasteful one. There are several techniques for this. Image maps let you associate multiple links with a single image, and CSS sprites combine many small images into one file, with each icon displayed through background positioning. Either way, you avoid issuing a separate request for every little image on your website.

It is also good practice to inline small images: the data URI scheme lets you embed images directly within the page. Data URIs are designed to carry small data items as "immediate" data, as if they had been referenced externally, so each inlined image saves an HTTP request.
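
As a minimal Node.js/TypeScript sketch (the file name is hypothetical), a small icon can be converted into a data URI and dropped straight into your markup:

// data-uri.ts - convert a small icon into a data URI (file name is hypothetical).
// Best reserved for tiny images: data URIs are not cached as separate resources
// and base64 encoding inflates the bytes by roughly a third.
import { readFileSync } from "node:fs";

const bytes = readFileSync("img/arrow.png");   // a small icon, a few hundred bytes
const dataUri = `data:image/png;base64,${bytes.toString("base64")}`;

// An <img> tag using this URI needs no extra HTTP request.
console.log(`<img src="${dataUri}" alt="arrow">`);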

Finally, combine your scripts and stylesheets. Merging all JavaScript into a single file, and all CSS into another, greatly reduces the number of HTTP requests and improves end-user response time.
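
A trivial way to do this at build time is to concatenate the individual files into one bundle. The Node.js sketch below (file names are made up) shows the idea; in practice a bundler such as webpack or Rollup does this job, plus much more:

// build-bundle.ts - naive concatenation build step (file names are made up).
import { readFileSync, writeFileSync } from "node:fs";

const sources = ["js/menu.js", "js/carousel.js", "js/analytics.js"];

// Join the scripts so the page needs a single request instead of three.
const bundle = sources
  .map((path) => `/* ${path} */\n` + readFileSync(path, "utf8"))
  .join("\n;\n"); // defensive semicolon in case a file omits its own

writeFileSync("js/bundle.js", bundle);
console.log(`wrote js/bundle.js from ${sources.length} files`);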

 

Rule 2 - Use a Content Delivery Network

Content delivery networks such as Akamai have servers in every part of the globe and have revolutionized the way the modern internet works. Rather than serving your website from a single machine, a CDN distributes your content and workload across many systems, delivering JavaScript libraries, HTML, CSS, fonts and other assets in the quickest and most reliable way for each individual user.

There are multiple reasons to use a content delivery network; website speed is just one of them.

These networks are usually a performance booster: when a user makes a request, the server closest to that user is determined dynamically, optimizing delivery speed. They are most effective for popular public websites that serve a substantial amount of static content.

The main disadvantages are the price and the added complexity of your website's deployment process.

 

Rule 3 - Add an Expires Header

Another simple trick to cut your website's loading time is to add an Expires header.

Expires headers tell the browser whether it should fetch content from the server or reuse the copy in its cache. By setting far-future Expires headers, the browser will prefer the content in its cache, reducing the number of requests and, consequently, the page loading time [3].
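
As a minimal sketch of what this looks like on the server (a bare Node.js handler; the path, port and one-year lifetime are illustrative), the response simply carries far-future Expires and Cache-Control headers:

// expires-server.ts - serve a static asset with far-future expiry headers.
// Path, port and lifetime are illustrative; far-future expiry works best with
// versioned file names, so that updated files still reach returning visitors.
import { createServer } from "node:http";
import { readFileSync } from "node:fs";

const ONE_YEAR = 365 * 24 * 60 * 60; // seconds

createServer((req, res) => {
  if (req.url === "/js/bundle.js") {
    res.writeHead(200, {
      "Content-Type": "application/javascript",
      "Cache-Control": `public, max-age=${ONE_YEAR}`,
      "Expires": new Date(Date.now() + ONE_YEAR * 1000).toUTCString(),
    });
    res.end(readFileSync("js/bundle.js"));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);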

 

Rule 4 - Gzip Components

Almost all modern websites use some form of compression. It pays off for text (HTML, CSS, JavaScript), since images (usually PNG) are already compressed. The result is a remarkable bandwidth saving, and the site becomes more responsive.

Compression slightly increases CPU usage, but this is rarely a problem, since the bottleneck is usually the connection rather than the CPU.

“The reason gzip works so well in a web environment is because CSS files and HTML files use a lot of repeated text and have loads of whitespace. Since gzip compresses common strings, this can reduce the size of pages and style sheets by up to 70%“[4].
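
You can see the effect on repetitive markup with a few lines of Node.js (the sample string is artificial but representative of HTML and CSS); on the web, the server applies the same compression and signals it with a Content-Encoding: gzip response header:

// gzip-demo.ts - show how well repetitive text compresses.
import { gzipSync } from "node:zlib";

// Artificial but representative: markup full of repeated tags and whitespace.
const html = '<li class="nav-item"><a href="#">item</a></li>\n'.repeat(500);

const compressed = gzipSync(html);
console.log(`original:   ${Buffer.byteLength(html)} bytes`);
console.log(`compressed: ${compressed.length} bytes`);
// The compressed output is a small fraction of the original size.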

 

Rule 5 - Put Stylesheets at the Top

You should always put CSS at the top of the page. Placing stylesheets in the HEAD of your document lets the page render progressively, so it feels faster, and it avoids the browser having to repaint the page when stylesheets arrive later.

Some browsers block rendering until all stylesheets have been downloaded, to avoid redrawing elements whose styles might change; if the stylesheets sit lower in the document, the user can be left staring at a blank white page.

 

Rule 6 - Put Scripts at the Bottom

Scripts should be placed at the bottom of the page, mainly so that other resources are not blocked while the script is fetched and run. While the browser is downloading and executing a script, it blocks parallel downloads, even of resources hosted on a different hostname.

 

Rule 7 - Avoid CSS Expressions

This used to be an important rule, but modern browsers no longer support CSS expressions. CSS expressions were powerful, yet they had one major problem: they were evaluated far more frequently than most people realised, not only when the page was rendered or resized, but also whenever it was scrolled or the mouse moved across it.

 

Rule 8 - Make JavaScript and CSS External

Inlining CSS and JS reduces the number of HTTP requests and can make a first visit feel faster, but when they are loaded from external files they become cacheable and do not need to be downloaded on every page view. It is an important trade-off to consider: most of the time it is better to accept a few extra requests in order to make the content cacheable.

 

Rule 9 - Reduce DNS Lookups

The Domain Name System maps hostnames to IP addresses. When you type a URL into your browser, a DNS resolver contacted by the browser returns the server's IP address. Like any service, DNS lookups have a cost in time.

The browser won't download anything from a hostname until its DNS lookup has completed.

Before any resource on your page can be downloaded, the browser must resolve the IP address of the host serving it, so at least one lookup is needed for each distinct domain referenced by the page. Reducing the number of unique hostnames therefore reduces the number of DNS lookups.
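
To get a feel for the cost, a small Node.js sketch (the hostnames are placeholders) can time a lookup for each distinct domain a page references:

// dns-cost.ts - time a DNS lookup for each hostname (hostnames are placeholders).
import { lookup } from "node:dns/promises";
import { performance } from "node:perf_hooks";

const hosts = ["www.example.com", "example.net", "example.org"];

for (const host of hosts) {
  const start = performance.now();
  try {
    const { address } = await lookup(host);
    console.log(`${host} -> ${address} in ${(performance.now() - start).toFixed(1)} ms`);
  } catch {
    console.log(`${host}: lookup failed`);
  }
}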

 

Rule 10 - Minify JavaScript

There are a number of reasons why minifying your JavaScript files is a good idea.

Minified JS downloads faster and reduces your website's bandwidth consumption. Combining many JavaScript files into one minified file also reduces the number of HTTP requests hitting your server, lowering server load and allowing more visitors to access your website.

Comments and whitespace are not needed for JavaScript execution; removing them reduces file size and speeds up download and parsing.

Minified JavaScript files are ideal for production environments, since minification typically reduces file size by 30-80%. Most of the reduction comes from removing comments and extra whitespace characters that neither web browsers nor visitors need.
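
A minimal sketch using the third-party terser package (assumed to be installed with npm install terser) shows what minification does to a small function:

// minify-demo.ts - strip comments, whitespace and long names with terser
// (a third-party package, assumed to be installed: npm install terser).
import { minify } from "terser";

const source = `
  // Adds the VAT rate to a net price.
  function addTax(netPrice, taxRate) {
      var grossPrice = netPrice * (1 + taxRate);
      return grossPrice;
  }
`;

const result = await minify(source);
console.log(result.code);
// Prints something like: function addTax(n,a){return n*(1+a)}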

 

Rule 11 - Avoid Redirects

It is important to avoid unnecessary redirects. A redirect triggers an additional HTTP request-response cycle and delays page rendering. In the best case each redirect adds a single round trip; in the worst case it also requires an extra DNS lookup, TCP handshake and TLS negotiation on top of the additional request-response cycle. You will obviously need some redirects, but you pay for them on every request, so get rid of the unnecessary ones.
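
The sketch below (a bare Node.js server; paths and port are made up) shows the mechanism: the 301 response forces the browser to issue a second request before it sees any content, and if the Location pointed at another hostname it would also pay for a fresh DNS lookup, TCP handshake and possibly TLS negotiation:

// redirect-cost.ts - a redirect costs the client a full extra round trip.
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url === "/old-page") {
    res.writeHead(301, { Location: "/new-page" }); // browser must come back again
    res.end();
  } else if (req.url === "/new-page") {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end("<h1>New page</h1>");
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);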

 

Rule 12 - Remove Duplicate Scripts

Simple and straightforward: duplicate scripts do nothing good for your website's performance, so get rid of them. Duplicate JavaScript and CSS creates extra requests and wastes time, since the same code ends up being evaluated more than once.

 

Rule 13 - Configure ETags

An entity tag (ETag) is a unique identifier assigned to a specific version of a resource on a web server. ETags are one of several mechanisms HTTP provides for cache validation, allowing a client to make conditional requests.

The problem with ETags is that they typically won't match when a browser fetches a component from one server and later sends a conditional GET request that lands on a different server, which is common on websites served by a cluster. For busy sites with multiple servers, ETags can therefore cause identical resources not to be served from cache, degrading performance. You should either configure them so they are consistent across servers or consider removing them altogether.
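
A minimal sketch of content-based validation (bare Node.js; the body and port are illustrative): because the tag below is derived only from the content, every server in a cluster produces the same value, so conditional GETs keep working:

// etag-demo.ts - ETag cache validation with a content-derived tag.
import { createServer } from "node:http";
import { createHash } from "node:crypto";

const body = "<h1>Hello</h1>";
// Hash of the content only, so every server in a cluster computes the same tag.
const etag = `"${createHash("sha1").update(body).digest("hex")}"`;

createServer((req, res) => {
  if (req.headers["if-none-match"] === etag) {
    res.writeHead(304, { ETag: etag }); // cached copy is still valid, send nothing
    res.end();
  } else {
    res.writeHead(200, { "Content-Type": "text/html", ETag: etag });
    res.end(body);
  }
}).listen(8080);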

 

Rule 14 - Make AJAX Cacheable

AJAX has given us some of the most impressive web applications and features we can think of. It keeps the interface responsive by requesting information asynchronously from the back-end web server.

But AJAX does not guarantee that the user will never have to wait for the asynchronous responses to return, and sometimes that wait can be long.

It is very important to optimize AJAX requests and responses, but also to make the responses cacheable, so the browser does not have to ask the server for the same thing over and over again.
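
A minimal sketch (bare Node.js; the endpoint, data and five-minute lifetime are made up): the AJAX response carries its own Cache-Control header, so repeated fetch() calls for the same data are answered from the browser cache:

// ajax-cache.ts - an AJAX endpoint whose responses the browser may cache.
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url === "/api/menu") {
    res.writeHead(200, {
      "Content-Type": "application/json",
      // The browser may reuse this response for five minutes without asking again.
      "Cache-Control": "public, max-age=300",
    });
    res.end(JSON.stringify({ items: ["Home", "News", "Sport"] }));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);

// On the client an ordinary GET is enough; the cache does the rest:
//   const data = await (await fetch("/api/menu")).json();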

 

Conclusions

Despite their simplicity, these rules combined have the power to improve your website's performance massively. Some of them are very easy to implement and bring big improvements in response time; others may mean hours of struggling, end up being costly and fail to provide a solid performance gain. It is up to you to find the right trade-off between cost and performance.

 

Now go on and make your website faster!

 

References

[1] - High Performance Web Sites: Essential Knowledge for Front-End Engineers, Steve Souders, September 18, 2007

[2] - How Your Website Loses 7% of Potential Conversions, in https://www.clickz.com/clickz/column/2097323/website-loses-potential-conversions, available on 14th April 2016.

[3] - YSlow: Add Expires headers, in https://gtmetrix.com/add-expires-headers.html, available on 14th April 2016.

[4] - Enable gzip compression, in: https://gtmetrix.com/enable-gzip-compression.html, available on 14th April 2016.

Others:

15 Easy Ways To Speed Up WordPress, in http://www.sparringmind.com/speed-up-wordpress, available on 14th April 2016.

7 Reasons to use a Content Delivery Network, in http://www.sitepoint.com/7-reasons-to-use-a-cdn, available on 14th April 2016.

5 Benefits of a CDN, in https://www.cdnetworks.com/blog/5-benefits-of-a-cdn, available on 14th April 2016.

Analyze your site’s speed and make it faster, https://gtmetrix.com, available on 14th April 2016.

Etags revisited, http://www.websiteoptimization.com/speed/tweak/etags-revisited, available on 14th April 2016.

Web Site Optimization: Maximum Website Performance, http://www.websiteoptimization.com, available on 14th April 2016.