Written by multiple hands, the document describes five areas of intervention that help limit errors and possible obstacles to the site’s performance (both in general and on search engines). More specifically, the list of suggestions focuses on:
- Applying instant loading with the PRPL pattern.
- Preloading critical resources to improve loading speed.
- Removing unused code.
- Minifying and compressing network payloads.
- Serving modern code to modern browsers for faster page loads.
PRPL pattern for instant loading
An article by Houssein Djirdeh, the same Googler who already guided us through the optimization of site images, opened the way for further tips and ideas: this time the topic is the PRPL pattern and instant loading, two elements that can really make a difference in terms of site speed.
The PRPL acronym describes a pattern for loading web pages and making them interactive, combining four techniques that can be applied together or independently to achieve good performance:
- Push (or preload) – Pushes (or preloads) the most important resources.
- Render – Renders the initial route as soon as possible.
- Pre-cache – Pre-caches the remaining assets and routes.
- Lazy load – Lazy-loads other routes and non-critical assets.
Using Google Lighthouse to check the page
The first step in identifying opportunities to improve a page through the PRPL techniques is to run Google Lighthouse: if a specific resource is requested and retrieved with significant delay, the tool flags a failed audit that we can act on.
A preload is a fetch request that tells the browser to retrieve a resource as soon as possible. We can flag critical resources by adding a <link> tag with rel="preload" in the head of the HTML document, so that the browser sets the proper priority level and tries to download the resource sooner, without delaying the window.onload event.
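As a sketch, a preload hint might look like this (the file names are illustrative, not taken from the article):

```html
<head>
  <!-- Tell the browser to fetch these critical resources early.
       File names below are examples only. -->
  <link rel="preload" href="/styles/critical.css" as="style">
  <link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
  <!-- The stylesheet is then used as usual; the preload just moved its fetch earlier. -->
  <link rel="stylesheet" href="/styles/critical.css">
</head>
```

The `as` attribute matters: it lets the browser assign the correct priority and apply the right content security policy to the fetched resource.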
How to avoid delaying First Paint
Lighthouse raises a warning whenever it finds resources that delay First Paint, the moment at which the site first displays pixels on the screen. There is no single correct way to reduce First Paint in every app: inlining styles and rendering on the server side may be necessary, if the benefits outweigh the trade-offs.
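One common way to inline styles, sketched here with illustrative file names and rules, is to embed the small amount of CSS needed for the first paint directly in the head and load the full stylesheet without blocking rendering:

```html
<head>
  <!-- Inline only the styles needed for above-the-fold content (example rules). -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { min-height: 100vh; }
  </style>
  <!-- Fetch the full stylesheet at low priority, then apply it once loaded. -->
  <link rel="preload" href="/styles/full.css" as="style" onload="this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/styles/full.css"></noscript>
</head>
```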
Tips to speed up the site
By acting as a proxy, service workers can retrieve resources directly from the cache rather than from the server on repeat visits: this not only lets users use the application while offline, but also makes page loads much faster on recurring visits. The whole process can be simplified with a third-party library, unless your caching requirements are more complex than what a library can provide.
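A minimal cache-first service worker might be sketched as follows; the cache name, the asset list, and the `isCacheable` helper are illustrative assumptions, not part of the original article:

```javascript
// Illustrative cache name and list of assets to pre-cache at install time.
const CACHE_NAME = 'site-cache-v1';
const PRECACHE_URLS = ['/', '/styles/main.css', '/scripts/app.js'];

// Hypothetical helper: only same-origin GET requests are served cache-first.
function isCacheable(request, origin) {
  return request.method === 'GET' && new URL(request.url).origin === origin;
}

// Register the handlers only when running in a service worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('install', (event) => {
    // Pre-cache the listed assets so later visits can skip the network.
    event.waitUntil(
      caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
    );
  });

  self.addEventListener('fetch', (event) => {
    if (!isCacheable(event.request, self.location.origin)) return;
    // Serve from the cache when possible, fall back to the network.
    event.respondWith(
      caches.match(event.request).then((cached) => cached || fetch(event.request))
    );
  });
}
```

This is the "proxy" behavior the paragraph describes: the fetch handler sits between the page and the network and answers from the cache first.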
Preload of critical assets
Preloading critical assets guarantees that the most important resources are retrieved and downloaded first by the browser. When a web page opens, the browser requests the HTML document from the server, parses its content, and sends separate requests for every referenced resource. Developers already know which resources the page needs and which ones matter most, so they can request critical resources in advance to speed up the whole loading process.
Preloading works on resources that the browser would normally discover late, and assigns them a higher priority: to mark a resource for preload we add a <link> tag with rel="preload" in the head of the HTML document, so that the browser stores it in the cache and makes it immediately available when needed.
Differences between preloading, preconnect and prefetch
Unlike preconnect and prefetch, which are hints the browser may act on as it sees fit, preload is mandatory for the browser. Modern browsers are already quite good at choosing the right priority for resources, which is why preloading should be used sparingly: apply it only to the most important resources, and always verify the actual impact with real-world tests.
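The three directives can be sketched side by side; the hosts and file names below are illustrative:

```html
<head>
  <!-- preload: a directive, fetch this critical resource now. -->
  <link rel="preload" href="/styles/critical.css" as="style">
  <!-- preconnect: a hint, open a connection to this origin early
       (DNS lookup, TCP handshake, TLS negotiation). -->
  <link rel="preconnect" href="https://fonts.example.com">
  <!-- prefetch: a hint, fetch this low-priority resource that a
       likely next navigation will need. -->
  <link rel="prefetch" href="/scripts/next-page.js">
</head>
```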
Removing unused code
Minifying and compressing network payloads
There are two specific techniques to improve the performance of a web page, minification and data compression, which reduce payload sizes and, consequently, improve the page’s loading times.
Data compression works through an algorithm that shrinks resources; files can be compressed dynamically or statically, and both approaches have their pros and cons, so the best choice is whichever suits your case.
With dynamic compression, resources are compressed on the fly whenever the browser requests them: this can be easier than manual or build-time processes, but can cause delays if high compression levels are used. Static compression reduces and saves resources ahead of time: the build process may take a bit longer, but there are no delays when the browser finally retrieves the compressed resource.
A modern code for modern browsers
Babel is a tool that compiles code written in recent syntax into code that older browsers and environments can understand; its counterpart is Lebab (simply Babel spelled backwards), a separate library that converts older code into newer syntax.
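One common way to serve modern code to modern browsers, sketched here with illustrative file names, is the module/nomodule pattern: browsers that understand `type="module"` load the modern bundle and ignore `nomodule`, while older browsers do the opposite:

```html
<!-- Modern browsers load the smaller bundle with recent syntax... -->
<script type="module" src="/scripts/app.modern.js"></script>
<!-- ...older browsers ignore type="module" and fall back to the transpiled bundle. -->
<script nomodule src="/scripts/app.legacy.js"></script>
```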