My name is Sergey. I am CTO at Magecom. My article is based on a talk I gave at Magento Meetup #11, where I talked about the results of the work of our entire Magecom team.
Let's say you have been asked to improve the performance of a site you still know almost nothing about. For example, a new client comes to you with a specific problem: their Magento 2 website is slow.
The task the team and I solved was assembling a set of universal solutions suitable for most projects. The main requirement was that each solution should fit into a minimal estimate and be as automated as possible to keep that estimate low. The solutions should also require minimal project knowledge, so that an engineer outside the project context can apply them.
Google PageSpeed Insights
To evaluate the results of our work, we use Google PageSpeed Insights.
It is a set of scripts for measuring performance metrics (and more). First of all, I would say it is the tool a client uses to confirm their feeling that the site is somehow slow. That means the same tool can be used to show the client how effective your performance work has been. Subjective impressions of website speed are one thing, but numbers are better.
The benefits of Google PageSpeed Insights do not end there. Besides measuring and visually demonstrating the site's performance, it also gives recommendations on how to improve it.
How we see this process
Quite often, the life of a client's project looks like this.
At the start, when Magento is configured, it has a certain baseline performance out of the box. Then, as the Nth module and a theme are introduced, the performance drops further. At the go-live stage, the volume of content increases: there are many products, pages, widgets on the homepage, and numerous categories in the main navigation menu (which means many nodes in the DOM). As a result, the performance collapses. After the go-live, we notice that the website works slowly, spend a couple of days doing something about it, and slightly improve the performance, without even reaching the numbers we had at the beginning of development.
The second line is the hours spent on improving performance based on this approach.
How we would like to see this process
We would like the performance to stay at its maximum at the start of the project and during development. A slight drawdown is of course allowed, especially at the go-live stage, but relatively little effort should be enough to return the indicators to normal.
But for this, you need to make a small investment (up to 8 hours) at the start of the project and during the development period.
My article is about what can be done during these 8 hours at the start of the project. It doesn't matter whether you are starting a project from scratch or it has just arrived at your company. Our task is to improve performance as much as possible using standardized steps.
How did we develop this procedure? We simply took the path of least resistance and started checking everything against the Google PageSpeed Insights and Lighthouse recommendations.
Prerequisites: measure performance
npm install -g lighthouse-ci
npx lighthouse-ci https://som.e/plp.html --filename=plp.html
The main rule here is that you should measure performance.
It's great that there are tools that allow you to do this automatically. It’s also good if you already have CI/CD configured. You can install Lighthouse CI, an npm package that can be run during pushes and deployments. It will give you a report during every pull request so you can always see which pull request is breaking performance.
Let's say we installed a module that added 150 KB of UI scripts to the frontend of all pages, even though it's just a store locator. It is better to avoid the situation where you introduce a store locator and, without measuring anything, build in-store pickup on the checkout on top of it. Then it turns out to be very slow, and you need to remove the whole thing, but the checkout is already built on it. It is much better to receive a notification about the regression immediately and fix it right away.
As a result, the first thing to start is setting up the performance measurement.
What Google PageSpeed recommends
To evaluate performance, Google Page Speed Insights measures several metrics and also evaluates some site parameters that can significantly affect these metrics. For each such parameter, there are recommendations on how to improve it if it suddenly turns out to be in the red (and not only in the red) zone. Sometimes these are general recommendations; sometimes, they are personalized for a specific platform.
Do not load images that are not visible on the first screen
Do not load images into the browser that users won't see on the first screen. You don't need to load them until users scroll to them or until your slider starts cycling through them. It is possible that users will never scroll that far, and there is no point in wasting network and browser resources on loading them.
There are several solutions to this issue:
- The old way: go to the marketplace, find a lazy-load extension, and install it.
- The new way: since HTML5 supports native lazy loading of images, you just need to add the loading=lazy attribute to the img tag (see the example below).
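In plain HTML, that looks something like this (a generic example, not taken from any particular theme):
<img src="example-product.jpg" alt="Example product" width="300" height="300" loading="lazy">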
At Magecom, we created an extension for this, which looks something like this if you skip details:
/**
 * @see \Magento\Framework\Controller\ResultInterface::renderResult
 */
public function afterRenderResult(
    \Magento\Framework\Controller\ResultInterface $subject,
    \Magento\Framework\Controller\ResultInterface $result,
    \Magento\Framework\App\ResponseInterface $response
): \Magento\Framework\Controller\ResultInterface
{
    $content = $response->getBody();
    $content = $this->imageProcessor->process($content);
    $response->setBody($content);

    return $result;
}

public function process(string $content): string
{
    $closure = function (array $match) {
        return str_replace('<img', '<img loading="lazy"', $match[0]);
    };

    return preg_replace_callback('/(<\s*img[^>]+>)/i', $closure, $content);
}
Before the server returns the HTML to the browser, we run a regular expression over it and add the loading="lazy" attribute to all img tags. That's it.
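For reference, such an after plugin could be registered in the module's etc/frontend/di.xml roughly like this (the vendor, module, and class names are made up for illustration):
<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="urn:magento:framework:ObjectManager/etc/config.xsd">
    <!-- intercept rendering of every controller result on the storefront -->
    <type name="Magento\Framework\Controller\ResultInterface">
        <plugin name="add_lazy_loading_to_images" type="Vendor\LazyImages\Plugin\AddLazyLoading" sortOrder="100"/>
    </type>
</config>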
Reduce server response time
This is, in fact, the most painful topic. If we skip the details, we can simply state that Magento is slow.
The main reason often mentioned in this context is the significant number of modules, many of which will never actually be used in the application. As a result, one way to improve this indicator is to turn off unused modules. You can find several reports and talks on this topic on various sites; it was even mentioned at Mageconf. There are many solutions for turning off modules if, for example, you are not using MSI or GraphQL.
The effect is obvious: the more modules we remove, the fewer configs are collected, and the faster the whole thing works. The effect is maximal if you remove the maximum number of modules.
To do this, we developed a module that does a little trick like this.
public function getUnusedModules(): array
{
    $allModules = $this->getAllModules();
    $requiredModules = $this->getRequiredModules();

    $unusedModules = array_diff_key($allModules, $requiredModules);
    ksort($unusedModules);

    return $unusedModules;
}

private function getRequiredModules(): array
{
    $requiredModules = [];
    $modules = $this->configProvider->getConfig();

    foreach ($modules as $moduleName) {
        $this->addRequiredModules($requiredModules, $moduleName);
    }
    ksort($requiredModules);

    return $requiredModules;
}

private function addRequiredModules(array &$requiredModules, string $moduleName): void
{
    if (array_key_exists($moduleName, $requiredModules)) {
        return;
    }

    $module = $this->moduleRepository->getModuleByModuleName($moduleName);
    $requiredModules[$moduleName] = $module;

    // Recursively pull in hard dependencies of every whitelisted module.
    foreach ($module->getDependencies() as $dependency) {
        $type = $dependency['type'];
        $dependencyName = $dependency['module'];
        if ($type === 'hard') {
            $this->addRequiredModules($requiredModules, $dependencyName);
        }
    }
}
We create a whitelist of modules that are used on the project, for example, Magento_ConfigurableProduct, Magento_Checkout, Magento_Sitemap, add to this list all the dependencies of these modules (see the composer.json require section) and dependencies of those dependencies, and so on recursively, until we get the complete list of modules used on the project. And then, we remove all the remaining modules by listing them in the replace section in the composer.json of the project.
If we skip the details, the main method here is getUnusedModules, which returns a list of modules that are not used in the project. It takes all the modules and the required ones, calculates the difference, and returns the modules that are not needed.
An interesting point is how getRequiredModules is calculated. It just implements the main algorithm of the module, which I described above.
It is a fairly simple script, but it allows us to get the most extensive possible list of modules that can be removed.
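The resulting list is then added to the replace section of the project's composer.json, as mentioned above. A minimal sketch of what that section might look like (these particular modules are only examples; the real list depends on the project):
{
    "replace": {
        "magento/module-dhl": "*",
        "magento/module-fedex": "*",
        "magento/module-ups": "*",
        "magento/module-multishipping": "*"
    }
}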
Preload Key Requests
There are resources on the site that you may need from the very beginning of the page loading. It would be great if they were already loaded from the start and not delayed until the moment they appear in the DOM.
The solution may be as follows: find these resources and add them to the layout XML via a link tag with the rel="preload" relation.
On web.dev, where this technique is described, you can read how it works. Or you can run Google PageSpeed Insights and see in the report which resources are the most important to preload.
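Ultimately, what has to end up in the page head is a standard preload link, for example for a font (the path is just an illustration):
<link rel="preload" href="/static/frontend/Vendor/theme/en_US/fonts/opensans/light/opensans-300.woff2" as="font" type="font/woff2" crossorigin>
One way to get it there is the theme's default_head_blocks.xml layout file; support for extra attributes on the layout link node varies between Magento versions, so check it against yours.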
Minify CSS
This point is already implemented in Magento. All we have to do is enable minification. I suggest doing this at the stage when we launch the project.
bin/magento config:set dev/css/minify_files 1 --lock-config
For example, in our company, project deployment configuration is automated, so this is added to the build scripts. From the very start of a project, config.php says that we minify scripts. If someone needs non-minified files locally in developer mode, they override the setting in env.php.
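A minimal sketch of such a local override in app/etc/env.php, assuming only the minification flags need to change (the rest of the file is omitted):
return [
    // ... existing env.php settings ...
    'system' => [
        'default' => [
            'dev' => [
                'css' => ['minify_files' => '0'],
                'js' => ['minify_files' => '0'],
            ],
        ],
    ],
];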
Minify JavaScript
If you don't have JavaScript minification enabled, Lighthouse will not be satisfied and will recommend minifying your scripts, suggesting Terser. At the same time, if you enable minification in Magento, it will be satisfied and will not notice that you are using Magento's minification rather than Terser.
Terser does a better job, but the gain is not large, so I don't recommend using it just for this. Terser will be helpful to us a little later; you can find it on GitHub.
There are other options if you do not want to install and configure Terser.
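Enabling Magento's built-in JavaScript minification mirrors the CSS setting above:
php bin/magento config:set dev/js/minify_files 1 --lock-config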
Eliminate render-blocking resources
The task is to display the first screen to users as soon as possible so that they can start interacting with the site as quickly as possible. Everything we don't need on the first screen is moved to the footer and not loaded right from the start. The primary resources here are CSS and JavaScript.
Google Page Speed Insights recommends doing the following steps.
For CSS, we need to enable critical CSS.
php bin/magento config:set dev/css/use_css_critical_path 1 --lock-config
There are tools that generate critical CSS by determining which CSS is critical and which is not, and help split it into separate files; there is also an npm package for that.
As for JavaScript, moving scripts to the footer is recommended so that they do not slow down page loading.
php bin/magento config:set dev/js/move_script_to_bottom 1 --lock-config
As it turned out, the recommendations are pretty controversial.
Applying these recommendations slightly improves the First Contentful Paint score but significantly degrades the Cumulative Layout Shift. And therefore, if we measure the performance BEFORE and AFTER these settings, then we will see performance degradation at the end.
As a result, we do not apply these recommendations by default.
Keep Request Counts Low
The best-known item is Keep Request Counts Low. The number of requests from the frontend is huge and needs to be reduced, and everything is quite simple here.
We enable CSS merging. Instead of hundreds of CSS files, one merged file is loaded.
CSS:
php bin/magento config:set dev/css/merge_css_files 1 --lock-config
As for JavaScript, Magepack is an advanced bundler that has proven itself very well on our projects and is easy to configure.
JavaScript:
composer require creativestyle/magesuite-magepack
npm install -g magepack
bin/magento config:set dev/js/merge_files 0 --lock-config
bin/magento config:set dev/js/enable_magepack_js_bundling 1 --lock-config
magepack generate --cms-url="{HOMEPAGE_URL}" --category-url="{PLP_URL}" --product-url="{PDP_URL}"
magepack bundle
By the way, this is where we still need Terser: with the Magepack bundler, we first generate the static content, then assemble bundles from it based on a specific config, and only after that minify the bundles. If we enable Magento's minification instead, we force Magepack to work with already minified files.
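A rough sketch of that order in a build script (the bundle file name pattern assumes Magepack's default naming and may differ on your project):
php bin/magento setup:static-content:deploy -f
magepack bundle
# minify the generated bundles afterwards, e.g. with Terser
find pub/static -type f -name 'bundle-*.js' -exec sh -c 'npx terser "$1" --compress --mangle -o "$1"' _ {} \;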
Avoid excessive DOM size
If you have more than 3000 nodes on the page, then Lighthouse will definitely be dissatisfied with that. It cannot be fixed quickly, and there is no general rule for how to do this.
There are several indirect ways to reduce it:
- Disable Page Builder. Disabling it can reduce the number of nodes. But Page Builder is an important element of the online store, so this step should be discussed with the client. Perhaps the client paid for a Magento license to have a Page Builder here or hired a content manager with no knowledge of HTML just because there is a Page Builder. If there are no such strict requirements, you can try to stop using the Page Builder to reduce the DOM size.
- Load some content via Ajax. For example, you have 1000 categories in your store, and they are all in the menu. You load them, and you end up with a list of 1000 items, each containing a link and possibly other elements. By using Ajax, you can render only the top-level menu and pull in the lower levels on demand.
Show images in next-gen formats
A new image format, WebP, has appeared; it is not inferior to JPEG in quality and takes up less space. You can simply install a module from the marketplace that converts images on the fly.
There is an option to use the resources of various CDNs, such as Fastly and Cloudflare. They can convert your images to the WebP format and work better than modules. The CDN determines whether the browser supports the WebP format and if so, it converts JPEG to WebP on the fly.
Serve static resources with an efficient caching policy
Everything is good here by default because the default nginx settings already configure the correct cache policy for all types of resources. The only catch is WebP: the default Magento nginx config does not know about WebP, so you will need to add it to the appropriate section.
location /media/ {
    try_files $uri $uri/ /get.php?$args;

    location ~ ^/media/theme_customization/.*\.xml {
        deny all;
    }

    location ~* \.(ico|jpg|jpeg|png|gif|svg|js|css|swf|eot|ttf|otf|woff|woff2|webp)$ {
        add_header Cache-Control "public";
        add_header X-Frame-Options "SAMEORIGIN";
        expires +1y;
        try_files $uri $uri/ /get.php?$args;
    }

    location ~* \.(zip|gz|gzip|bz2|csv|xml)$ {
        add_header Cache-Control "no-store";
        add_header X-Frame-Options "SAMEORIGIN";
        expires off;
        try_files $uri $uri/ /get.php?$args;
    }

    add_header X-Frame-Options "SAMEORIGIN";
}
Remove unused CSS
As soon as we enable Merge CSS, Lighthouse starts complaining about it. We get one large file with all the CSS from the entire site, and on each specific page it starts flagging unused CSS, effectively telling us that merged CSS files need to be turned off.
But with Merge CSS turned off, the scores are lower. Therefore, as with render-blocking resources, we ignore this recommendation.
Enable text compression
The server should serve scripts and all text files in compressed form (gzip). Everything is fine here — the default nginx config already gzips them.
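For reference, the relevant part of such a config looks roughly like this (the Magento nginx sample ships an equivalent block):
gzip on;
gzip_comp_level 6;
gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss image/svg+xml;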
There is also a small micro-adjustment: enabling HTML minification. This setting removes extra whitespace and line breaks. You won't save much, but every extra byte counts:
php bin/magento config:set dev/template/minify_html 1 --lock-config
Efficient image encoding
If the CDN has an image optimization function and you upload images in high quality, it will bring them down to web quality and serve the browser the minimum amount of data. There are also a bunch of extensions that do the same thing; we recommend Apptrian — we use it often and have had no problems with it.
Core Web Vitals
Google Page Speed Insights measures metrics in two conceptually different contexts.
Lab Data are measurements made in laboratory conditions: standardized hardware with particular characteristics and the latest version of the Google Chrome browser. The recommendations for improving performance are based on these metrics.
Field Data are almost the same parameters but are measured on the devices of specific users.
Without a doubt, these data can be very different from each other. And the effect of applying these recommendations can be very different.
For example, suppose most customers use the site through the Internet Explorer browser (because these are some government agencies where outdated software can be used). In that case, they may not notice that you added the loading=lazy attribute to the images because this browser does not support this attribute.
By default, we consider that these recommendations apply only to lab data. It may take more time to analyze customer devices to improve performance for customers.
What's next
This is not a complete list of recommendations. Some we still haven't tested firsthand, and they're in our TODO backlog. But the effect of fixing them should not be significant because, as a rule, Google Page Speed Insights does not complain about them when we are dealing with Magento.
There are also some metrics (for example, reduce JavaScript execution time) that cannot be improved quickly; you will have to dive deep into the project to understand how to fix them.
The TODO list currently looks like this:
- Properly size images;
- Reduce JavaScript execution time;
- Preconnect to required origins;
- Avoid multiple page redirects;
- Use video formats for animated content;
- Reduce the impact of third-party code;
- Avoid non-composited animations;
- Lazy load third-party resources with facades;
- Avoid enormous network payloads;
- Avoid chaining critical requests;
- User Timing marks and measures;
- Minimize main thread work;
- Ensure text remains visible during Webfont load.
Conclusion
In this article, we looked at some simple and reliable ways to improve a site's performance so that Google Page Speed Insights likes it too.
If you have other ways to solve the problem raised here, write in the comments.
Note: Google Page Speed Insights is just a web service; the metrics themselves come from another tool called Lighthouse, which Google Page Speed Insights uses under the hood. For simplicity, the article only uses the name Google Page Speed Insights, even where it's really about Lighthouse.