Florian Rappl

Migration of Mario 5 to Serverless

In the previous three articles (Migration from Classic Hosting to Serverless, Migration of a Multiplayer Game from Hosted to Serverless, and Migration of a Dynamic Website to a Static Website) I introduced you to my plan of migrating away from my dedicated server to a fully serverless infrastructure.

This time I want to cover how I migrated the Mario 5 game, including its social features and level editor, to a serverless API.

The Mario 5 game is based on an older article of mine. It has been hosted on my server for over a decade, but now it's time to migrate it to the cloud.

Why Migrate?!

Quick recap: what do I expect from this migration?

  • A cleaner code base (finally I can tidy things up, remove some parts, and modernize others)
  • No more FTP or messy / unclear deployments - everything should be handled by CI/CD pipelines
  • Cost reduction; it sounds weird, but the last thing I want is a cost increase (today the hosting is about 30 € per month and my goal is to bring this to 10 € or below - note: I pay much more for domains, but those costs are not included here as they will remain the same).

There is a bit of background to this: having my own dedicated server is something I was initially happy about; however, over the years the burden of properly maintaining the machine became too high. I have quite a few things on my plate, and dealing with the software side of a dedicated server was always at the bottom of my to-do list.

Now here we are: in the middle of migrating all my resources from a dedicated server to a serverless environment. Right now there are only two things left:

  1. A game found at mario5.florian-rappl.de
  2. The homepage itself florian-rappl.de

While (2) will be a larger refactoring, we will start with (1) to see how it (including its database, login features, etc.) can be migrated to the cloud.

Code Evaluation

Originally, the code for the Mario 5 website consisted of an ASP.NET MVC website with three JavaScript files - one for the game itself, one for the editor, and one to enable, disable, and wire up the buttons that are available on the website (outside of the game or editor). Each page was fully server-side rendered. The ASP.NET MVC web app was connected to a MySQL database.

When the website is accessed, it renders a page like the one shown below:

Homepage

This page leads to multiple options. One interesting option is to create your own levels using the level editor:

Editor

Of course, most often people are just interested in playing the game. By default, i.e., without selecting a custom level, the "campaign" is launched:

Game

The whole architecture is shown below.

The original architecture

For this part of the migration I wanted to fully convert the Mario 5 website into a static web app. Therefore, I needed to:

  • Convert the necessary API functions into Azure Functions
  • Convert the application as a whole into a SPA
  • Convert the MySQL database into a format suitable for a database-as-a-service offering (Azure Table Storage)

The anticipated architecture of this application after the migration / conversion is shown below:

The new architecture

There are a few parts we will get rid of:

  • Translations; surely I could convert them, too, but this would be rather cumbersome and I no longer have any use for the German texts - English all the way
  • Social sharing; this plugin / script is outdated, looks ugly, and is essentially useless
  • The original .NET OAuth implementation - we'll use the Azure Static Web App authentication with GitHub and Microsoft accounts

The last of these - the original .NET OAuth implementation - used a custom provider that also allowed registering an arbitrary account:

Account

We want to replace this with a generic OIDC / OAuth 2 flow using Microsoft and GitHub accounts. This is supported out of the box in Azure Static Web Apps and does not require any development on our side. This way, there is no need to manage accounts.
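
If we wanted to restrict certain routes to logged-in users, this can also be declared in the staticwebapp.config.json of the Static Web App. A minimal sketch - the protected route and redirect target are illustrative, not taken from the actual project:

{
  "routes": [
    {
      "route": "/api/level",
      "methods": ["POST", "PUT", "DELETE"],
      "allowedRoles": ["authenticated"]
    }
  ],
  "responseOverrides": {
    "401": {
      "redirect": "/login",
      "statusCode": 302
    }
  }
}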

Most importantly, however, we'll need to convert the database.

Database Conversion

The central part (or why this is/was server-side rendered at all) is the database of the application. The database holds information regarding the registered users, the custom levels, and the ratings from users to the levels. Overall, the database contained 11 tables.

Of the 11 tables we only need to convert 2; the other 9 are related to the ASP.NET Membership functionality that we'll drop anyway.

The tables to migrate

For the actual conversion we can just use the JSON export feature of phpMyAdmin.

The JSON export view of the levels table

The JSON export results in a document that looks like the following snippet:

[
  {"type":"header","version":"5.0.3","comment":"Export to JSON plugin for PHPMyAdmin"},
  {"type":"database","name":"mario5"},
  {"type":"table","name":"ratings","database":"mario5","data":[]}
]

The data field is an array consisting of the actual table rows. A row can look like the following:

{"userid":"3","levelid":"10","stars":"5","comment":"Cool!","created":"2012-07-31 20:59:09"}

The job is to bring those rows into Azure Table Storage, so that we can use them with the converted application. For this, I wrote a little script that reads the exported tables and generates new ones. My idea was to have three tables:

  • levels, which is the table with all the available levels (partition is userId, key is id, and new fields are userName and rating)
  • ratings, which contains the individual ratings for all available levels (partition is userId, key is levelid, and the new field is userName)
  • users, which is a new table (partition is provider, key is userId, and the main field is userName)

The essential idea behind these new fields is to provide everything in a single query; this way, Azure Table Storage remains super fast. The downside can be seen in the new rating field: while it avoids computation at query time, it requires re-computation whenever a new rating is created. This is a challenge for data consistency - but overall a sane choice in our migration scenario.
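
To illustrate the consistency concern: whenever a new rating is stored, the denormalized Rating field of the level has to be re-computed. A sketch of that write path, where ratingsClient and levelsClient are assumed to be TableClient instances from the @azure/data-tables package (the helper itself is illustrative; table and field names follow the scheme above):

// re-computes the denormalized rating of a level after a new rating was stored
async function updateLevelRating(ratingsClient, levelsClient, userId, levelId) {
  let sum = 0;
  let count = 0;

  // all ratings of a level share the same RowKey (the level id)
  const entities = ratingsClient.listEntities({
    queryOptions: { filter: `RowKey eq '${levelId}'` },
  });

  for await (const entity of entities) {
    sum += entity.Stars;
    count++;
  }

  // merge the new value into the existing level entity
  await levelsClient.updateEntity(
    {
      partitionKey: userId,
      rowKey: levelId,
      Rating: count > 0 ? (sum / count).toFixed(2) : "0",
    },
    "Merge"
  );
}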

To migrate the existing (exported) tables into new tables I've written a small script that processes the JSONs and yields some CSV files that can be imported directly into Azure Table Storage:

const { writeFileSync, mkdirSync } = require("fs");
const { resolve, dirname } = require("path");

const levels = getRows(require("./src/levels.json"));
const memberships = getRows(require("./src/my_aspnet_membership.json"));
const users = getRows(require("./src/my_aspnet_users.json"));
const ratings = getRows(require("./src/ratings.json"));

// extracts the rows of the (single) table contained in a phpMyAdmin JSON export
function getRows(items) {
  return items
    .filter((m) => m.type === "table")
    .map((m) => m.data)
    .pop();
}

// maps the numeric ASP.NET user id to the (URL-encoded) email address,
// which becomes the new user id
function getUserId(id) {
  return encodeURIComponent(memberships.find((m) => m.userId === id).Email);
}

// looks up the display name of a user
function getUserName(id) {
  return users.find((m) => m.id === id).name;
}

// computes the denormalized average rating of a level
function getRating(id) {
  const items = ratings.filter((m) => m.levelid === id);

  if (items.length > 0) {
    const sum = items.reduce((prev, curr) => +curr.stars + prev, 0);
    return (sum / items.length).toFixed(2);
  }

  return "0";
}

// characters that force a value to be quoted in the CSV output
const reserved = [';', '"', '\n'];

// quotes (and escapes) a value if it contains reserved characters
function convertValue(v) {
  if (typeof v === "string" && reserved.some(r => v.includes(r))) {
    return JSON.stringify(v).replaceAll('\\"', '""');
  }

  return v;
}

// writes the given items as a semicolon-separated CSV file into dist/
function exportToCsv(items, file) {
  const columns = Object.keys(items[0]);
  const rows = [
    columns.join(";"),
    ...items.map((item) => Object.values(item).map(convertValue).join(";")),
  ];
  const content = rows.join("\n");
  const target = resolve(__dirname, "dist", file);
  mkdirSync(dirname(target), { recursive: true });
  writeFileSync(target, content, "utf8");
}

const convertedLevels = levels.map((level) => ({
  PartitionKey: getUserId(level.userid),
  RowKey: level.id,
  Name: level.name,
  "Name@type": "String",
  Description: level.description,
  "Description@type": "String",
  Skill: +level.skill,
  "Skill@type": "Int32",
  Background: +level.background,
  "Background@type": "Int32",
  Played: +level.played,
  "Played@type": "Int32",
  Content: level.content,
  "Content@type": "String",
  Created: new Date(level.created).toJSON(),
  "Created@type": "DateTime",
  Updated: new Date(level.updated).toJSON(),
  "Updated@type": "DateTime",
  UserName: getUserName(level.userid),
  "UserName@type": "String",
  Rating: getRating(level.id),
  "Rating@type": "String",
}));

const convertedUsers = users.map((user) => ({
  PartitionKey: "aspnet",
  RowKey: getUserId(user.id),
  LastActivityDate: new Date(user.lastActivityDate).toJSON(),
  "LastActivityDate@type": "DateTime",
  Email: getUserId(user.id),
  "Email@type": "String",
  UserName: getUserName(user.id),
  "UserName@type": "String",
}));

const convertedRatings = ratings.map((rating) => ({
  PartitionKey: getUserId(rating.userid),
  RowKey: rating.levelid,
  Created: new Date(rating.created).toJSON(),
  "Created@type": "DateTime",
  UserName: getUserName(rating.userid),
  "UserName@type": "String",
  Comment: rating.comment,
  "Comment@type": "String",
  Stars: +rating.stars,
  "Stars@type": "Int32",
}));

exportToCsv(convertedLevels, "levels.export.csv");
exportToCsv(convertedUsers, "users.export.csv");
exportToCsv(convertedRatings, "ratings.export.csv");

With these three tables in place, it's time to rework the authentication.

Authentication

The Azure Static Web App service handles everything for us; all we need to do is change the code to reflect this. For instance, every request to the Azure Functions carries an extra header with the currently logged-in user (or lack thereof) attached to it. Reading it out in a C# Azure Function can be done using these helpers:

private static bool IsAuthenticated(HttpRequestData req)
{
    // Identity is null for anonymous requests, so guard against that
    var identity = Parse(req);
    return identity.Identity?.IsAuthenticated ?? false;
}

private static string GetUserId(HttpRequestData req)
{
    var identity = Parse(req);
    return identity.Identity?.Name;
}

private static ClaimsPrincipal Parse(HttpRequestData req)
{
    var principal = new ClientPrincipal();

    if (req.Headers.TryGetValues("x-ms-client-principal", out var headers))
    {
        var data = headers.First();
        var decoded = Convert.FromBase64String(data);
        var json = Encoding.UTF8.GetString(decoded);
        principal = JsonSerializer.Deserialize<ClientPrincipal>(json, new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    }

    principal.UserRoles = principal.UserRoles?.Except(new string[] { "anonymous" }, StringComparer.CurrentCultureIgnoreCase);

    if (!principal.UserRoles?.Any() ?? true)
    {
        return new ClaimsPrincipal();
    }

    var identity = new ClaimsIdentity(principal.IdentityProvider);
    identity.AddClaim(new Claim(ClaimTypes.NameIdentifier, principal.UserId));
    identity.AddClaim(new Claim(ClaimTypes.Name, principal.UserDetails));
    identity.AddClaims(principal.UserRoles.Select(r => new Claim(ClaimTypes.Role, r)));

    return new ClaimsPrincipal(identity);
}
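
For completeness, the ClientPrincipal used above is a small DTO mirroring the JSON payload of the x-ms-client-principal header:

public class ClientPrincipal
{
    public string IdentityProvider { get; set; }
    public string UserId { get; set; }
    public string UserDetails { get; set; }
    public IEnumerable<string> UserRoles { get; set; }
}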

Very often the reason for reading out the user is to retrieve more information, e.g., the given user name. Since the user name is a free-text field that can be chosen by the logged-in user, we need to obtain it from the database.

Using Azure Table Storage as a database we can do it as follows:

private static User GetDefaultUser(HttpRequestData req)
{
    var identity = Parse(req);
    return new User
    {
        PartitionKey = identity.Identity.AuthenticationType,
        RowKey = HttpUtility.UrlPathEncode(identity.Identity.Name),
        Name = identity.Identity.Name,
        Email = identity.Identity.Name,
        LastActivityDate = DateTimeOffset.Now,
    };
}

private async Task<User> QueryUser(HttpRequestData req)
{
    var identity = Parse(req);
    var provider = identity.Identity.AuthenticationType;
    var userId = identity.Identity.Name;
    var usersClient = serviceClient.GetTableClient("users");
    // GetEntityIfExistsAsync avoids an exception when the user has not been stored yet
    var user = await usersClient.GetEntityIfExistsAsync<User>(provider, HttpUtility.UrlPathEncode(userId));
    return user.HasValue ? user.Value : GetDefaultUser(req);
}

In the previous code we used the serviceClient instance, which is the TableServiceClient (from the Azure.Data.Tables NuGet package) for the database.

Among other possibilities, it could be initialized as follows:

var storageUri = Environment.GetEnvironmentVariable("STORAGE_URI");
var accountName = Environment.GetEnvironmentVariable("STORAGE_NAME");
var storageAccountKey = Environment.GetEnvironmentVariable("STORAGE_KEY");
var url = new Uri(storageUri);
var credentials = new TableSharedKeyCredential(accountName, storageAccountKey);
serviceClient = new TableServiceClient(url, credentials);
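
During local development, these variables would typically live in the local.settings.json of the Functions project; a sketch with placeholder values:

{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "STORAGE_URI": "https://<account>.table.core.windows.net",
    "STORAGE_NAME": "<account>",
    "STORAGE_KEY": "<key>"
  }
}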

Using the authentication information in our SPA part is also possible. Here, we can use the provided /.auth/me endpoint:

async function getUserInfo() {
  const response = await fetch("/.auth/me");
  const payload = await response.json();
  const { clientPrincipal } = payload;
  return clientPrincipal?.userDetails;
}

This endpoint just reflects the header back to the caller. Very neat and quite useful. Keep in mind that all HTTP calls are implicitly authenticated by Azure Static Web Apps anyway; there is no token or anything else we need to take care of - it's all handled for us.
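
The same built-in endpoints cover logging in and out, so plain links per provider are all the SPA needs (github and aad are the pre-configured provider names; the component itself is just a sketch):

function LoginLinks() {
  return (
    <div>
      <a href="/.auth/login/github">Sign in with GitHub</a>
      <a href="/.auth/login/aad">Sign in with Microsoft</a>
      <a href="/.auth/logout">Sign out</a>
    </div>
  );
}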

With this in mind let's dive into the specific conversion of the code.

Static Web App

Alright, let's get to the meat of it. We start by changing the code structure a bit as shown below:

Changes to the code structure

Importantly, we'll move all the relevant backend code into an api folder, while all the frontend-related code goes into the app folder. In there, we'll distribute it as follows:

  • All static assets go into public
  • All JavaScript and stylesheet code goes into src
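
The resulting structure looks roughly like this (a sketch; the exact file layout may differ):

app/
  public/    (static assets such as sprites and sounds)
  src/       (SPA code and stylesheets)
api/         (Azure Functions backend)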

In addition we'll add a pipeline definition. We'll go into details on that one in the next section ("Deployment").

Finally, the original code base needs to be transformed. This consists of two parts:

  1. Transform the API / backend code to Azure Functions
  2. Transform the views / frontend code to a SPA

For (1) we'll essentially just copy over the C# code and augment it to go from

// GET: /Level/

public ActionResult Index(string id, int? o)
{
    var model = new RatesModel();
    var ml = (from level in db.Levels
              let ratings = db.Ratings.Where(m => m.LevelId == level.Id)
              select new RateModel
              {
                  Level = level,
                  Sum = (int?)ratings.Sum(m => m.Stars),
                  Total = ratings.Count()
              }).ToList();

    model.Offset = o != null ? Math.Max((int)o, 0) : 0;
    model.Entries = MAX_ENTRIES_PER_PAGE;

    if (id == null)
        id = "latest";

    switch (id)
    {
        case "popular":
            model.AddRange(ml.OrderByDescending(m => m.Level.Played).ToList());
            break;
        case "rated":
            model.AddRange(ml.OrderByDescending(m => m.Rating * Math.Min(m.Total, 100)).ToList());
            break;
        case "easy":
            model.AddRange(ml.OrderBy(m => m.Level.Skill).ToList());
            break;
        case "hard":
            model.AddRange(ml.OrderByDescending(m => m.Level.Skill).ToList());
            break;
        default:
            id = "latest";
            model.AddRange(ml.OrderByDescending(m => m.Level.Updated).ToList());
            break;
    }

    model.SortID = id;
    Session.Add("LastListType", id);
    Session.Add("LastOffset", model.Offset);
    return PartialView(model);
}

to the following:

// GET /api/level
[Function(nameof(GetLevels))]
public async Task<HttpResponseData> GetLevels([HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "level")] HttpRequestData req)
{
    var response = req.CreateResponse(HttpStatusCode.OK);
    var levelsClient = serviceClient.GetTableClient("levels");
    var levels = levelsClient.QueryAsync<Level>();
    var items = new List<LevelEntity>();

    await foreach (var level in levels)
    {
        items.Add(new LevelEntity
        {
            background = level.Background,
            id = level.RowKey,
            name = level.Name,
            description = level.Description,
            rating = level.Rating,
            ratings = level.TotalRatings,
            user = level.UserName,
            skill = level.Skill,
            played = level.Played,
            created = level.Created,
            updated = level.Updated,
        });
    }

    var content = JsonSerializer.Serialize(new { items });
    response.Headers.Add("Content-Type", "application/json");
    response.WriteString(content);
    return response;
}

If you look closely you'll spot that the original code had an optional route parameter ("id"), which was used to determine how the levels should be sorted. This responsibility now moves to the frontend. While such functionality could make sense for an API (though rather as a query parameter), we don't have enough items here to need it at the moment.
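
In the SPA, sorting the fetched levels then becomes a small helper; a sketch mirroring the original sort modes (the helper names are illustrative):

const sorters = {
  latest: (a, b) => new Date(b.updated) - new Date(a.updated),
  popular: (a, b) => b.played - a.played,
  rated: (a, b) =>
    b.rating * Math.min(b.ratings, 100) - a.rating * Math.min(a.ratings, 100),
  easy: (a, b) => a.skill - b.skill,
  hard: (a, b) => b.skill - a.skill,
};

function sortLevels(items, id = "latest") {
  return [...items].sort(sorters[id] ?? sorters.latest);
}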

Originally, the code contained snippets such as:

var marioStyles = new Bundle("~/Content/mario.css", new CssMinify());
marioStyles.AddFile("~/Content/Game.css");
marioStyles.AddFile("~/Content/Site.css");
marioStyles.AddFile("~/Content/Menu.css");
marioStyles.AddFile("~/Content/Editor.css");
marioStyles.AddFile("~/Content/OpenId.css");
marioStyles.AddFile("~/Content/Smartphone.css");
bundles.Add(marioStyles);

This can now be transformed to a single stylesheet (style/index.css):

@import url('./Game.css');
@import url('./Site.css');
@import url('./Menu.css');
@import url('./Editor.css');
@import url('./OpenId.css');
@import url('./Smartphone.css');

In the end, this is optimized and taken care of by the bundler (at compile time instead of at runtime).

The biggest change, however, is the introduction of a SPA using Preact. We'll use Preact as it's much more lightweight than React without constraining how we build the UI. We are also not really interested in heavily optimizing this; we should already get better performance than with the original code base - and with a much improved developer experience. This way, we already have big wins.

The entry point of the SPA is shown below. It renders the app with the help of the preact-router package.

import { render } from "preact";
import { Router } from "preact-router";
import Home from "../pages/home";
// ... more pages
import Layout from "./layout";

const container = document.querySelector("#app");
const App = (
  <Layout>
    <Router>
      <Home path="/" />
      <Profile path="/profile" />
      <Login path="/login" />
      <Logout path="/logout" />
      <Redirect path="/game" to="/game/campaign" />
      <Campaign path="/game/campaign" />
      <Custom path="/game/:id" />
      <Levels path="/level" />
      <LevelLoad path="/level/load" />
      <LevelDetails path="/level/details/:id" />
      <LevelRatings path="/level/ratings/:id" />
      <Editor path="/level/edit" />
    </Router>
  </Layout>
);

render(App, container);
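
One note on the Redirect component used above: preact-router does not ship one, so it has to be defined manually. A minimal version, following the pattern from the preact-router documentation:

import { Component } from "preact";
import { route } from "preact-router";

class Redirect extends Component {
  componentDidMount() {
    // replace the current history entry with the target route
    route(this.props.to, true);
  }

  render() {
    return null;
  }
}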

The layout is a simple component defined like this:

import { useState } from "preact/hooks";
// Menu and Dialog are local components; their imports are omitted here

export default function ({ children }) {
  const [showControls, setShowControls] = useState(false);
  const [showAbout, setShowAbout] = useState(false);
  const toggleControls = () => setShowControls((c) => !c);
  const toggleAbout = () => setShowAbout((c) => !c);

  return (
    <>
      <div id="toppanel">
        <Menu onShowControls={toggleControls} onShowAbout={toggleAbout} />
      </div>
      {children}
      <Dialog shown={showControls} onToggle={toggleControls}>
        <p>
          <span class="control">&larr;</span>move left
        </p>
        <p>
          <span class="control">&rarr;</span>move right
        </p>
        <p>
          <span class="control">&uarr;</span>jump
        </p>
        <p>
          <span class="control">&darr;</span>duck
        </p>
        <p>
          <span class="control">a</span>sprint + shoot
        </p>
      </Dialog>
      <Dialog shown={showAbout} onToggle={toggleAbout}>
        ...
      </Dialog>
    </>
  );
}

That's already a massive win compared to the previous code, which looked like this:

<div id="toppanel">
<div id="socials">
<div class="shr_class shareaholic-show-on-load"></div>
</div>
<nav id="topnav">
<div class="topmenu topicon" id="soundButton"><img alt="Sound" src="@Url.Content("~/Content/sound_on.png")" data-sound-on="@Strings.MusicOff" data-sound-off="@Strings.MusicOn" /></div>
<div class="topmenu tophead" id="menuButton">@Strings.Menu</div>
<div class="topmenu topicon" id="controlsButton"><img alt="Controls" src="@Url.Content("~/Content/controls.png")" title="@Strings.Controls" /></div>
<div class="topmenu topicon" id="infoButton"><img alt="About" src="@Url.Content("~/Content/info.png")" title="@Strings.About" /></div>
@Html.Partial("_LogOnPartial")
</nav>
</div>
@RenderBody()
<div id="bottompanel">
<nav id="bottomnav">
@Html.Partial("_ButtonsPartial")
</nav>
</div>

Things like the sound image (<img alt="Sound" ... />), which contained magic attributes (data-sound-on="@Strings.MusicOff") that still required duplication in JS, now live fully in JS without any duplication.

For instance, the sound button is its own component:

function SoundButton() {
  const [sound, setSound] = useState(settings.musicOn);
  const toggleSound = () => setSound((s) => !s);

  useEffect(() => {
    if (settings.musicOn !== sound) {
      settings.musicOn = sound;
      saveSettings();

      if (sound) {
        sounds.playMusic();
      } else {
        sounds.pauseMusic();
      }
    }
  }, [sound]);

  return (
    <div class="topmenu topicon" onClick={toggleSound}>
      <img
        alt="Sound"
        src={sound ? soundOn : soundOff}
        title={sound ? "Turn music off" : "Turn music on"}
      />
    </div>
  );
}
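
The settings and saveSettings helpers used above are not shown in full; a plausible minimal implementation (the storage key is hypothetical) persists them in localStorage:

const SETTINGS_KEY = "mario5-settings";

// defaults merged with whatever was stored previously
export const settings = Object.assign(
  { musicOn: true },
  JSON.parse(localStorage.getItem(SETTINGS_KEY) || "{}")
);

export function saveSettings() {
  localStorage.setItem(SETTINGS_KEY, JSON.stringify(settings));
}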

Previously, the standard levels were all part of one giant bundle. Now, we can lazy load the levels of the campaign:

const definedLevels = [
  () => import("./campaign/00"),
  () => import("./campaign/01"),
  () => import("./campaign/02"),
  () => import("./campaign/03"),
  () => import("./campaign/04"),
  () => import("./campaign/05"),
  () => import("./campaign/06"),
  () => import("./campaign/07"),
  () => import("./campaign/08"),
  () => import("./campaign/09"),
  () => import("./campaign/10"),
  () => import("./campaign/11"),
  () => import("./campaign/12"),
  () => import("./campaign/13"),
  () => import("./campaign/14"),
];
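
Loading a level then just means awaiting the respective import; a sketch, assuming each level module default-exports its level definition:

async function loadCampaignLevel(index) {
  const loadLevel = definedLevels[index];

  if (typeof loadLevel !== "function") {
    throw new Error(`No campaign level with index ${index}.`);
  }

  const { default: level } = await loadLevel();
  return level;
}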

This way, the centerpiece of the game is as lightweight as it can be - but the whole game is still based on jQuery and its original engine code.

Deployment

For CI/CD we use Azure Pipelines. This way (like in the previous articles) we can just push and "forget" about deploying the code. The build definition looks as follows:

trigger:
- master

pool:
  vmImage: ubuntu-latest

variables:
- group: deployment-tokens

steps:
- task: AzureStaticWebApp@0
  inputs:
    app_location: '/app'
    output_location: 'dist'
    api_location: '/api'
    azure_static_web_apps_api_token: '$(mario5-token)'

Essentially, this is very similar to our previous definitions, except that a build is required for both parts - the API and the static web assets.

Conclusion

It runs - faster, more modern, and far more cost-efficient (for the given subdomain, no additional costs are incurred). The crucial part was to identify a way of providing the content in the mode that best fits its purpose.

Using an Azure Static Web App with a SPA instead of a server-side rendered page made sense for this particular application. We get improved (and much simplified) authentication, a faster startup time, and a more reliable code base. Also, with the pipelines in place, future development seems feasible (not that it matters much).

By now, the dedicated server has already been taken down. The migration is finished, and I'm only using this series to write up my steps and experiences.

In the next post I'll look into the architecture of my personal homepage - and what I plan for its migration and rewrite.
