Florian Rappl

Migration of a Dynamic Website to a Static Website


In the previous two articles (Migration from Classic Hosting to Serverless and Migration of a Multiplayer Game from Hosted to Serverless) I've introduced you to my plan of migrating away from my dedicated server to a fully serverless infrastructure. This time I want to go into the plan in more detail.

Why Migrate?!

In general, I expect from this migration:

  • A cleaner code base (finally I can clean up my stuff, maybe remove some things, and modernize other parts)
  • No more FTP or messy / unclear deployments - everything should be handled by CI/CD pipelines
  • Cost reduction; sounds weird, but the last thing I want is a cost increase (today it's about 30 € per month for the hosting and my goal is to bring this below or close to 10 € - note: I pay much more for domains, but those costs are not included here as they will remain the same).

There is a bit of background to this: Having my own dedicated server is something I was initially happy about; however, over the years the burden of properly maintaining this machine became a bit too high. I have quite a few things on my plate, and dealing with the (software side of a) dedicated server was always at the bottom of my ToDo list.

Over the years the idea of moving all parts of my server to the cloud certainly started to appeal to me. However, one thing always stood in the way: email. I want (and need) my email to be hosted as well, and until now the only option would have been to set up a virtual machine (VM) at some provider... which is exactly what I want to avoid! Not only do VMs at the popular cloud providers cost more than my current dedicated hosting, they would also require me to be a mail admin. And trust me - the last thing I want to be is a mail admin. It's hard.

Now I needed to look for a solution as - for the second time since I've had a dedicated server - the configuration I use is being phased out at my hosting provider. Like the first time, there is no comparable configuration available, so I'd have to get something beefier (and more expensive). This time, however, I'll skip the mandatory upgrade; I'll move to the cloud instead.

This is already final - as shown below (German "Kündigung" means "cancellation" or "termination" of the service):

Quitting the dedicated server

Still, I first need to solve the problem of email.

Solving Email

I did a bit of research on the available mail providers:

  • Google, Microsoft etc.
  • Proton Mail
  • Zoho Mail
  • Posteo
  • Yandex
  • Fastmail

In the end I went with Fastmail. Why? Quite simple - it pretty much solves the core problem of my cloud migration in a way that makes the whole migration work without much trouble. How?!

Fastmail offers the possibility of using custom domains. This way you don't just get an address at a Fastmail domain, but addresses at your own domains. While most other mail providers also offer custom domain support, Fastmail includes - for a budget of $5 - a custom DNS service on top, supporting up to 100 custom domains (enough for me) with free configuration. That's right - it's essentially a $5 DNS service with email.

But Fastmail does not stop there. One other problem: when registering a custom domain I could only set a CNAME for the www subdomain, leaving the root (apex) domain undefined. While most browsers handle this scenario very nicely (auto-completing the www subdomain), there is one browser that makes trouble here; but you'd need to get on a Safari to find it. How does Fastmail solve this root domain problem? With static websites!

On Fastmail you can create a redirect for the custom domain. So I can just make the root domain respond automatically with a 301 (moved permanently) pointing to the www subdomain. Ah, and of course, the website is automatically HTTPS-protected by a Let's Encrypt certificate.
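Conceptually, the resulting DNS zone then looks roughly like this (a sketch with placeholder names and values, not my actual records):

```
; apex points at Fastmail's redirect service,
; which answers every request with a 301 to the www subdomain
example.com.      3600  IN  A      203.0.113.10                 ; placeholder IP
www.example.com.  3600  IN  CNAME  my-static-host.example.net.  ; placeholder target
```

The apex gets an A record (CNAMEs are not allowed at the apex), while the actual site lives behind the CNAME on www.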

With email solved it's time to look where I've been and where I want to be from an architecture point of view.

Previous Architecture

Previously, everything was hosted on a dedicated server. The DNS service, mail, and websites were all served from the same machine, which also hosted the database. Updated content was brought to the server via FTP. Any kind of configuration was done manually.

The following diagram illustrates this.

Previous architecture

There are three aspects that I want to see moving to a new architecture:

  • It should be more flexible, i.e., leaving room to use other languages and paradigms
  • It should be more efficient, i.e., being able to serve my page (or any other project) even faster
  • It should be easier for me to grasp and maintain

All this should be fulfilled without going over the previous budget. Ideally, it should even be cheaper than before. Just as a number: my page has around 40k views per month - so this is the load that the new architecture should be able to handle well.

Below I've sketched how I envision the new architecture. Configuration and updates in this scheme are all done via CI/CD. The mail provider also has an extensive API allowing me to send and receive emails automatically / have an AI assistant in between (more on that aspect later).

Anticipated architecture

This is, of course, quite high-level. The actual details, i.e., where each of my websites is rolled out and how these boxes connect, are to be determined in a more low-level diagram.

In general I will not remove any website. I am a huge proponent of reliable URLs, i.e., I will not drop or change URLs intentionally. Anything that was on the web last year should still be on the web. However, not everything needs to remain as-is; I am free to change (or simplify) things as long as the previous content stays (mostly) intact.

One example is a website I did for a lecture I gave on software design patterns. For this I decided to make a dynamic to static conversion.

Dynamic to Static Conversion

For this website I wanted to remove the dynamic part. This was a bit difficult, as the whole page (every presentation and slide) was generated dynamically by a custom CMS.

Instead, what I ended up doing is using AngleSharp to transform the existing (dynamic) website into static files. I stored them on disk, ready to be served statically.

The following script was used to get the initial download of the static pages:

using AngleSharp;
using AngleSharp.Html.Dom;
using AngleSharp.Io;

List<string> downloadedUrls = new();

async Task Main()
{
    var url = ""; // root URL of the dynamic site
    var target = "~/code/florian-rappl-patterns/public";
    await DownloadPage(new Url(url), target);
}

void CreateIfNotExists(string dir)
{
    if (!Directory.Exists(dir))
    {
        Directory.CreateDirectory(dir);
    }
}

async Task DownloadAsset(Url url, string targetDir)
{
    if (!downloadedUrls.Contains(url.Href))
    {
        downloadedUrls.Add(url.Href);

        var file = Path.Combine(targetDir, url.Path);
        var dir = Path.GetDirectoryName(file);
        CreateIfNotExists(dir);

        var client = new HttpClient();
        using var stream = await client.GetStreamAsync(url.Href);
        using var fs = File.Create(file);
        await stream.CopyToAsync(fs);
    }
}

async Task DownloadPage(Url url, string targetDir)
{
    // Don't download these - they wouldn't be useful and will be removed
    if (url.Path.StartsWith("Account") || url.Path.StartsWith("Slides"))
    {
        return;
    }

    // only download websites within the origin
    if (!downloadedUrls.Contains(url.Href))
    {
        downloadedUrls.Add(url.Href);

        var dir = Path.Combine(targetDir, url.Path);
        var indexPath = Path.Combine(dir, "index.html");
        var config = Configuration.Default.WithRequesters().WithDefaultLoader();
        var context = BrowsingContext.New(config);
        var document = await context.OpenAsync(url);

        CreateIfNotExists(dir);
        File.WriteAllText(indexPath, document.Source.Text);

        // download all stylesheets
        foreach (var link in document.QuerySelectorAll<IHtmlLinkElement>("link[href]"))
        {
            await DownloadAsset(new Url(link.Href), targetDir);
        }

        // download all scripts
        foreach (var script in document.QuerySelectorAll<IHtmlScriptElement>("script[src]"))
        {
            await DownloadAsset(new Url(script.Source, url.Href), targetDir);
        }

        // download all images
        foreach (var img in document.QuerySelectorAll<IHtmlImageElement>("img[src]"))
        {
            await DownloadAsset(new Url(img.Source), targetDir);
        }

        // follow all links within the same origin
        foreach (var anchor in document.QuerySelectorAll<IHtmlAnchorElement>("a"))
        {
            var href = anchor.Href;

            if (href.StartsWith(url.Origin))
            {
                await DownloadPage(new Url(href), targetDir);
            }
        }
    }
}

After running the script, all available pages are downloaded and ready to be served as a static website.

Using search and replace I've also added a few enhancements, such as using a file like jquery.js instead of the previously given URL /bundles/jquery?someid, which originally led to a bundle endpoint performing some MVC magic.

Another thing I did via search and replace was to transform the URLs for the UML diagrams (usually something like /diagrams/1d830940-8feb-4c70-b355-b5370cfcd825) into a proper SVG reference (/diagrams/1d830940-8feb-4c70-b355-b5370cfcd825.svg). With a bit more time invested I could have also renamed them properly, e.g., /diagrams/mvc-pattern.svg, but having the proper extension is good enough for now.
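As a sketch, both rewrites can also be expressed as regular expressions applied to each downloaded HTML file. The helper names and the /js/jquery.js target path below are made up for illustration; the actual search-and-replace was done manually:

```csharp
using System.Text.RegularExpressions;

static class StaticUrlRewriter
{
    // Turn MVC bundle URLs such as /bundles/jquery?someid
    // into a plain file reference (illustrative target path).
    public static string RewriteBundles(string html) =>
        Regex.Replace(html, @"/bundles/jquery\?[^""']*", "/js/jquery.js");

    // Append the .svg extension to GUID-based diagram URLs,
    // skipping any that already end in .svg.
    public static string RewriteDiagrams(string html) =>
        Regex.Replace(html, @"(/diagrams/[0-9a-fA-F-]{36})(?!\.svg)", "$1.svg");
}
```

Running both over every downloaded .html file (e.g., via Directory.EnumerateFiles) gives the same result as a manual search and replace, but reproducibly.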

The result of the static-ification of the website is seen below. Full URLs were transformed into a folder structure with an index.html each. Surely, a nicer transformation would use something like Astro with proper re-use; however, the effort for such a transformation would have been considerably higher. If the website were still actively used, or if I envisioned some progression here in the following years, I would potentially invest the time, but right now it does not seem to be needed.

Structure after static-ification

The deployment is done via an Azure Pipeline:

trigger:
- master

pool:
  vmImage: ubuntu-latest

variables:
- group: deployment-tokens

steps:
- task: AzureStaticWebApp@0
  inputs:
    app_location: '/public'
    api_location: '/api'
    skip_app_build: true
    azure_static_web_apps_api_token: '$(patterns-token)'

The deployment targets Azure Static Web Apps. This is perfect for the scenario at hand: a mostly static website with a few APIs (in this case exclusively used for the search). Note that the free tier of Azure SWA is used, i.e., this is another area of my website that now runs at essentially zero cost.
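Azure Static Web Apps picks up the API functions from the api_location automatically; optional behavior such as caching or MIME types can be tuned via a staticwebapp.config.json next to the static files. An illustrative sketch (not the actual configuration of this site) might look like:

```json
{
  "routes": [
    {
      "route": "/diagrams/*",
      "headers": { "cache-control": "public, max-age=604800" }
    }
  ],
  "mimeTypes": { ".svg": "image/svg+xml" }
}
```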


It runs - faster and more cost-efficient (for the given subdomain no additional costs will occur). The crucial part was to identify a way of providing the content in the mode that best fits its purpose.

Making the patterns website static was the right choice - the dynamic nature of this page was no longer required. All the material has already been created, and the individual HTML files are sufficient to keep the previous user experience alive.

In the next post I'll look into the migration of a game (Mario 5) with its level editor and backend API.

Currently, the dedicated server is still operational - but I need to finish the migration by the end of the year.

Top comments (2)

efpage

Sounds appealing if your website was a mainly static page delivered by a custom CMS. But there are 2 questions for me (maybe I missed something?):

  1. How do you deal with page changes? Your initial CMS was used to organize the content. You will probably not be able to change anything in the static output files, as they have been created by the CMS and are usually hard to read. Is there a way to do the whole conversion in an automated toolchain to keep your content living?

  2. Unless you use only CSS to make your content dynamic, you will probably have some JavaScript code. Can you transfer your JavaScript and run it on the final static files without manual changes?

I was thinking about a way to store static pages from a CMS automatically, but these parts are kind of tricky to preserve. If your page is highly interactive, it may also be complicated to distinguish the static parts from the dynamic elements.

Florian Rappl

Have you read the other articles in the series? The patterns example is one of many. In this example the source was not updated in almost a decade and I do not need the dynamic part. For other parts (especially my page which is covered in an article to come) you'll need a different strategy.

Regarding 1:

If you have a scenario where parts might still change, I'd advocate a conversion to Astro, or using the dynamic / CMS part only locally and doing the conversion at build time from there. This way you keep the dynamic parts - run them on demand locally - and then have a step that uses the local source to generate the static content. The appeal of this approach is that it's essentially the same effort as what was discussed in this article. Surely, you could spend more time and improve this process - the question (as usual) is: is it worth it? Would a complete refactoring be better if you really want to remain flexible?

Regarding 2:

And yes, in this example the downloaded assets (JS/CSS) have been kept unchanged and just work, except for the search endpoint, which was changed. Surely, you cannot generalize from this (there could be references that break), but in most cases there won't be any problems.

For more details on changing more dynamic code see, e.g., the second article, which creates a hosted container, or the first one, which starts from a static page that already has a search API and transforms this. Hint: the current one uses the same strategy, except that it also had to create its static assets first, as a custom CMS was the source.