Divya

Do I have to rebuild my entire JAMstack site for every new change?

A recommended best practice on the JAMstack is to trigger a new site build and deploy for every change that comes through. This ensures that sites are kept consistent across a globally distributed set of CDN servers. As you can imagine, with this setup build times increase as sites scale. In data-driven industries like e-commerce, where content (that significantly drives revenue) is ever-changing, time is of the essence. In order to stay competitive and relevant, updates need to be instantaneous. The JAMstack approach of atomic deploys poses a challenge to larger sites like these with thousands of pages, since builds tend to be lengthy.

A key strategy for speeding up large site builds is building incrementally. Incremental building is the practice of rebuilding only the parts of a site that changed. To accomplish this, older content that doesn't change remains cached, while modified content is regenerated and updated in the cache. This way, deployments are perceived as instant. However simple incremental builds may seem on the surface, they are in fact a tricky business. Most notably, they require pinpointing exact changes and updating the build appropriately. Because of this complexity, many SSGs have yet to implement them (without charging for them), and the technique remains fairly inaccessible for now.
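To make the bookkeeping concrete, here is a minimal sketch of one way an SSG could decide what to rebuild, assuming a hypothetical manifest of content hashes kept between builds (the manifest path and hashing strategy are illustrative, not any particular SSG's implementation):

```ts
import { createHash } from "crypto";
import { existsSync, readFileSync, writeFileSync } from "fs";

// Hypothetical manifest mapping each source file to the content hash it had
// at the previous build. Pages whose hash changed get regenerated; the rest
// are left in the cache untouched.
type Manifest = Record<string, string>;

const MANIFEST_PATH = ".build-manifest.json";

function hashFile(path: string): string {
  return createHash("sha256").update(readFileSync(path)).digest("hex");
}

export function pagesToRebuild(sourceFiles: string[]): string[] {
  const previous: Manifest = existsSync(MANIFEST_PATH)
    ? JSON.parse(readFileSync(MANIFEST_PATH, "utf8"))
    : {};

  const next: Manifest = {};
  const changed: string[] = [];

  for (const file of sourceFiles) {
    const hash = hashFile(file);
    next[file] = hash;
    if (previous[file] !== hash) changed.push(file); // new or modified source
  }

  writeFileSync(MANIFEST_PATH, JSON.stringify(next, null, 2));
  return changed; // only these pages get rebuilt and pushed to the CDN
}
```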

A more accessible way to achieve faster builds is to decouple dynamic content like pricing from the prebuilt markup itself. APIs like Commerce Layer allow you to inject content at request time, while serverless functions enable you to leverage dynamic server-side functionality at the edge as needed. The combination of static with dynamic means you can stay confident that time-sensitive content is always up to date.
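As a rough illustration of this split, the sketch below fetches live prices into prebuilt markup when the page renders in the browser. The endpoint and data shape are placeholders rather than Commerce Layer's actual API; the idea is simply that the static HTML ships a stable SKU and a fallback price, and the fresh price is swapped in at request time:

```ts
// Placeholder price shape; not Commerce Layer's actual response format.
type Price = { sku: string; formatted: string };

async function hydratePrices(): Promise<void> {
  // The prebuilt markup marks each price element with a stable SKU,
  // e.g. <span data-sku="TSHIRT-M">$20.00</span> as a fallback.
  const nodes = document.querySelectorAll<HTMLElement>("[data-sku]");

  for (const node of nodes) {
    const sku = node.dataset.sku!;
    const res = await fetch(`/api/prices/${sku}`); // hypothetical endpoint
    if (!res.ok) continue; // keep the prebuilt fallback price on failure

    const price: Price = await res.json();
    node.textContent = price.formatted; // swap in the live price at request time
  }
}

document.addEventListener("DOMContentLoaded", hydratePrices);
```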

For more on incremental builds with regard to SSGs, check out React Static, an SSG built for React and the only SSG that currently supports incremental builds (for free).

Top comments (7)

Waylon Walker

I like this idea of decoupling dynamic content to prevent too many incremental builds! I do it at work to avoid maintaining hooks into data sources that are tricky to track.

One thing I would like to see is a way of having one source that fetches data from the SSG and refreshes the data client side upon render. We wouldn't want to over-fetch every source, but I definitely have some sources I don't route through Gatsby because I don't control the changes, and I don't want to maintain the data in two places. It's fast enough to do it client side so I don't worry too much about it.
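A rough sketch of that "build-time snapshot, refresh on render" pattern, assuming the SSG inlines a copy of the data into the page and the same source is re-fetched in the browser (the global name and JSON path are placeholders):

```ts
type Feed = { items: string[]; generatedAt: string };

declare global {
  // Snapshot of the data the SSG inlined into the page at build time.
  interface Window { __BUILD_DATA__?: Feed }
}

export async function loadFeed(): Promise<Feed> {
  // Render instantly from the copy baked in at build time...
  const buildTimeCopy = window.__BUILD_DATA__;

  try {
    // ...then refresh the one source we don't control, client side on render.
    const res = await fetch("/data/feed.json");
    if (res.ok) return (await res.json()) as Feed;
  } catch {
    // Network failure: fall back to the build-time copy below.
  }

  if (buildTimeCopy) return buildTimeCopy;
  throw new Error("feed unavailable");
}
```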

Divya

When you say "fetches data from the SSG", do you mean at build time?

Waylon Walker

Lol yes, not really worded well. 🤣

Divya

All good! Doesn't that setup still mean depending on fast builds for speedy updates on render?

Waylon Walker

The JSON is hosted separately, and is live as soon as it's published. The data is fetched live from the client side on render.

I do have a short cache time on it to speed up refreshes, but still allow for near-live data. Or use a hard refresh to force it to fully refetch.
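For what that looks like on the client, here's a small sketch using the Fetch API's cache modes (the JSON path is a placeholder): a short max-age on the hosted file keeps normal loads fast, while a forced refresh bypasses the cache entirely.

```ts
async function getLiveData(forceRefresh = false): Promise<unknown> {
  const res = await fetch("/data/live.json", {
    // "reload" ignores any cached copy, mirroring a hard refresh;
    // "default" honours the short Cache-Control max-age set on the hosted JSON.
    cache: forceRefresh ? "reload" : "default",
  });
  return res.json();
}
```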

Nick

Nift has had incremental builds since back in 2015, plus it's up to 4.5 times faster than Hugo at just full builds. A new version will be released in about a week where I've added a full type system including filestreams (will be useful for pagination), arrays, vectors, maps, user-defined types with template types like C++, scoping, constant and/or private variables/functions, user-defined functions, logical/relational/arithmetic/incremental operators, if/else-if/else statements, for/while/do-while loops, etc. A few bugs on Windows have been reported since releasing the last version, so if you're on Windows it's worth waiting a week until 2.1 is out. The last few things needed for a good pagination solution will then be added for a 2.2 release.

Incremental builds can actually be reasonably easy to add to an SSG. With Nift, when I successfully build a page I create an 'information file' for it, whose main purpose is to keep a record of all the files that were used in building the webpage (which I call dependencies). When I need to check whether a page needs rebuilding, I check if any of the dependency files have been modified more recently than the page's information file; if they have, the page is flagged for rebuilding. Other SSG developers are welcome to copy this; the better the JAMstack tooling, the more people we can get involved.
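A rough TypeScript sketch of that dependency-tracking idea (not Nift's actual code; the information file is assumed here to be a JSON list of dependency paths):

```ts
import { existsSync, readFileSync, statSync } from "fs";

// A page needs rebuilding when any file used to build it has been modified
// more recently than the page's information file.
function needsRebuild(infoFilePath: string): boolean {
  if (!existsSync(infoFilePath)) return true; // never built successfully

  const builtAt = statSync(infoFilePath).mtimeMs;
  const dependencies: string[] = JSON.parse(readFileSync(infoFilePath, "utf8"));

  return dependencies.some((dep) => {
    if (!existsSync(dep)) return true;      // a dependency was removed
    return statSync(dep).mtimeMs > builtAt; // a dependency changed since last build
  });
}
```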

This is probably the fastest technique for incremental builds, though it does not always work well if development is not all on the same machine. For example, if you make changes locally and push them without rebuilding, hoping to run an incremental build on Netlify's servers, then unfortunately you won't be able to do what you want. However, if you push changes without rebuilding and someone else uses git pull to pull the changes into their local environment, then incremental builds will work across machines (perhaps Netlify could switch to pulling only the changes rather than the whole repo when a commit is pushed? It would certainly scale much better for larger projects; just have the servers use git pull).

To get around this potential problem, some other SSGs use other methods to check whether the content of a page has been updated and hence needs to be rebuilt. These methods are not quite as fast, but do sound like they're fast enough to handle projects with at least 100k pages, though I'm not sure what their full build times are like on 100k pages when you make changes to templates used site-wide. Nift can build all of a basic 100k-page website in under 12 seconds, and all of a basic 10k-page website in a little over 1s. It also uses less than 500MB of memory on a full build of 1 million pages, but unfortunately the near-O(n) build times observed from 10k to 100k pages do not continue from 100k to 1 million; I get times closer to 6 minutes for a full build with 1 million pages.

You can also use Nift to pre-build parts of dynamic websites (e.g. PHP), and you can optionally use Nift's template language with other things like CSS/SCSS/Sass, JS files, etc. Nift also works well with other template languages.

Raymond Camden

When I first converted to static, I ran into this issue pretty early on. My site was 6K+ pages and every new piece of content made them all change. Why? Because I had a "Recent Blog Posts" pod in the layout. When I finally figured that out, I switched that pod to load the content via XHR and it helped tremendously.
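A small sketch of that fix, using fetch in place of literal XHR: the layout ships an empty placeholder, and the recent-posts list is loaded client side so new posts no longer touch every page (the JSON path and element ID are placeholders):

```ts
// Placeholder post shape and JSON path; the real feed can be anything
// that's published independently of the site build.
type Post = { title: string; url: string };

async function loadRecentPosts(): Promise<void> {
  const target = document.querySelector("#recent-posts"); // empty placeholder in the layout
  if (!target) return;

  const res = await fetch("/recent-posts.json");
  if (!res.ok) return;

  const posts: Post[] = await res.json();
  target.innerHTML = posts
    .map((post) => `<li><a href="${post.url}">${post.title}</a></li>`)
    .join("");
}

loadRecentPosts();
```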