Marvin
Page flickering when using feature flags and A/B testing tools

As developers, we constantly strive to improve our applications. We fix bugs, implement new features, and prioritize user experience. These updates and enhancements are critical for the success of our applications and need to be rolled out in a timely manner.

Feature flags are an effective tool for minimizing risk and reducing lead time. By putting code behind a flag, we can seamlessly deploy updates and toggle the flag when the new feature is ready for release. This allows us to quickly and easily make changes to our applications without disrupting the user experience.

As with any technology, there are limitations to using feature flags (or A/B testing) in JavaScript applications, including those built with the JAMStack. One common issue that can arise is “page flickering,” which can be frustrating for both developers and users.

The “Page Flickering Problem”

Let me illustrate the “page flickering” problem:

Timeline of a feature flag resolution

The previous picture illustrates what happens when using feature flags (or A/B tests) in a typical JavaScript application, including JAMStack ones. Here are some explanations:

  • A user makes a request to a website
  • They get back its /index.html file (empty for client-only applications, already filled for JAMStack ones)
  • The browser parses the HTML file and executes the scripts it contains
  • The website is now visible (empty or filled)
  • The frontend application kicks in and the site becomes interactive
  • The JavaScript application fetches the feature flags from the corresponding (remote) service
  • It gets the flags back and updates the application, showing the new version with or without the flagged feature

In the previous picture, two steps are highlighted with bold red squares. Their written equivalents, also highlighted in bold, appear in the list above.

My question is: for the end user, what happens between the moment the page is displayed and the moment the flag resolves?

The answer: they see the old version, then the UI shifting around, then the new version, in that order.
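The timeline above can be sketched in a few lines of JavaScript. Everything here is hypothetical (the flag name, the fake flag service, the delay); the point is only the ordering of what the user sees:

```javascript
// Simulation of the client-side flag timeline. The page paints the
// default variant first; the remote flags arrive later and the UI
// swaps. That swap is the visible flicker.
function fetchFlags() {
  // Stand-in for a network round trip to a remote flag service.
  return new Promise((resolve) =>
    setTimeout(() => resolve({ newHero: true }), 100)
  );
}

async function renderPage(paint) {
  paint("old-hero"); // 1. initial paint with the default variant
  const flags = await fetchFlags(); // 2. flags resolve later
  paint(flags.newHero ? "new-hero" : "old-hero"); // 3. the UI swaps
}

// Record what the user would see, in order.
const frames = [];
renderPage((variant) => frames.push(variant)).then(() => {
  console.log(frames); // ["old-hero", "new-hero"]
});
```

The user is shown `"old-hero"` first, then `"new-hero"` once the flags resolve: that transition is exactly the flicker.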

How do people generally mitigate this behaviour?

Fortunately, there are multiple solutions available on the internet for dealing with feature flags so that you don’t have to build your own (such as Progressively).

Spinners, loading indicators

To address the problem illustrated in the previous picture, one solution is to hide all the website’s content during the flag resolution and display a spinner or loading indicator. On subsequent page loads, the browser storage can be used to quickly show the correct variation and adjust over time.

While this solution may be suitable for applications and dashboards, it may not be the best approach for marketing websites where we want the pages to load instantly. In these cases, hiding the website’s content and displaying a spinner or loading indicator may not be acceptable.
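A minimal sketch of this approach, with the storage and the spinner injected as plain objects so the logic is visible on its own (the cache key and the shape of the flag payload are assumptions):

```javascript
// Hide-until-resolved: show a spinner while the flags load, and cache
// the last known flags so returning visitors skip the spinner.
const CACHE_KEY = "feature-flags";

async function resolveFlags({ fetchFlags, storage, ui }) {
  const cached = storage.getItem(CACHE_KEY);
  if (cached !== null && cached !== undefined) {
    // Returning visitor: show the cached variant immediately.
    return JSON.parse(cached);
  }
  ui.showSpinner(); // first visit: hide content while flags resolve
  const flags = await fetchFlags();
  storage.setItem(CACHE_KEY, JSON.stringify(flags));
  ui.hideSpinner();
  return flags;
}
```

In a browser, `storage` would be `window.localStorage`, `fetchFlags` a call to the flag service, and `ui` the code toggling the loading indicator.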

Shifting resolution to the infrastructure layer

Upon closer examination, we can see that the problem stems from the fact that applications only run after the browser has already displayed the page to the user. Specifically, the script tags in the HTML document are only executed when the HTML document is loaded in the browser. This results in a delay between the time the user sees the page and the time the flag resolves.

However, some things run before that point, even if we don’t rely on a dedicated server that renders pages (think server-side rendering).

The infrastructure layer is a place where we can do work ahead of time.

I’ve written a post, some time ago, about A/B testing with the JAMStack and how we can achieve this at the infrastructure layer.

Using Server Side Rendering

By recognizing that the problem is caused by running the flag resolution in the browser, we can instead move this resolution back to the server.

With tools like Next.js, Remix, or any other framework that supports it, you can generate the valid HTML document for each request, with the content behind a flag variant already resolved.
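To make the idea concrete without tying it to one framework, here is a framework-free sketch: the server resolves the flag (with a toy deterministic bucketing function standing in for a real flag service) and the HTML it sends already contains the right variant, so nothing swaps in the browser. All names here are illustrative:

```javascript
// Toy stand-in for a flag service: deterministic 50% bucketing, so a
// given user always lands on the same variant.
function isFlagEnabled(flagName, userId) {
  let hash = 0;
  for (const ch of userId + flagName) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 100 < 50;
}

// Server-side render: the variant is baked into the HTML response,
// so the browser never shows one version and then the other.
function renderHtml(userId) {
  const hero = isFlagEnabled("new-hero", userId)
    ? "<section>New hero</section>"
    : "<section>Old hero</section>";
  return `<!doctype html><html><body>${hero}</body></html>`;
}
```

In Next.js, the same resolution would happen in a server-side data-fetching step, with the flag value passed to the page as a prop before any HTML reaches the browser.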

Progressively can also assist in this situation.

What would be the best way to fix this issue?

This is subjective, and my personal opinion. It mostly targets high-traffic situations where hitting the server that hosts the application might cost a lot of money.

Just as I design APIs by first defining the desired outcome and then implementing the details, I will first specify what I want and how to achieve it, keeping in mind that client-side only and JAMStack applications may not behave as desired. This approach allows me to design the solution before diving into the details.

What I want:

  • A website that does not blink
  • A website that does not cost a lot of money
  • A fast website

A website that does not blink

One solution to this problem is to use a technology that employs Server Side Rendering. With this approach, the page is generated on the server and delivered to the user in its completed form, allowing for instant display in the browser. This can greatly improve the loading speed and overall performance of the site.

My preference in an ideal world would be Next.js (I will explain why), but you can use any tool that supports SSR, and it will work great.

A website that does not cost a lot of money

The JAMStack is truly amazing, especially when paired with a free CDN like GitHub Pages. Plus, it won’t cost a dime!

A server-side rendering tool, on the other hand, requires some expenditure. However, with the help of a CDN and aggressive caching of requests to reduce the load on the origin server (the costly part), it can be made affordable.

1, 2, 3…

The introduction of the CDN has just broken the feature flags functionality. This is because CDN caching is applied at the URL level, so when requesting https://website.com/, every visitor will receive the last version cached by the CDN, regardless of their flag variant.

However, solutions exist, such as Vercel’s Edge Middleware, that let you run code at the edge, before the CDN cache is reached. And this is a perfect place to resolve feature flag (and A/B test) variants.
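A sketch of the edge-side logic, kept framework-free so it can stand on its own: the variant, cookie, and path names are assumptions, and in a real Vercel project this logic would live in the middleware entry point and use the platform’s rewrite helper instead of returning plain data.

```javascript
// Edge-side variant assignment: pick a variant before the CDN cache
// is consulted, keep it sticky via a cookie, and rewrite to a
// variant-specific URL so each variant is cached separately.
function pickVariant(cookieHeader) {
  const match = /variant=(new|old)/.exec(cookieHeader || "");
  return match ? match[1] : Math.random() < 0.5 ? "new" : "old";
}

function handleEdgeRequest(requestUrl, cookieHeader) {
  const variant = pickVariant(cookieHeader);
  const url = new URL(requestUrl);
  url.pathname = `/${variant}${url.pathname}`; // e.g. /new/pricing
  return {
    rewriteTo: url.toString(), // the CDN caches each variant URL separately
    setCookie: `variant=${variant}; Path=/`, // keep the assignment sticky
  };
}
```

Because each variant lives under its own URL, the CDN can keep serving cached pages aggressively without ever mixing variants between users.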

If you are interested, this blog post shows how to use Progressively and Vercel Edge Middleware together.

A fast website

A server-side-rendered application now runs on Vercel, with feature flags handled at the edge using Edge Middleware and pages cached by a CDN.

This technology stack is known for its speed.

The last mile

Remember my decision to use Next.js earlier? The reason is that, as mentioned earlier, CDN caching works at the URL level, and the same is true for edge middleware: you can’t modify the internals of the document in an edge middleware, since it runs before the CDN cache.

But one interesting thing that arrived in Next.js 13 is support for React Server Components (RSCs). As the name suggests, RSCs run on a server, somewhere, and are presumably reachable through a URL.

What if, in the future, edge middleware could process only the requests that aim to resolve RSCs?
