Thomas Krone
The Cloud Resume Challenge Meets Southern Japan

Hone the DevOps Basics While Exploring Kyushu on a Road Trip

Disclaimer: The content of this blog post refers to the features of DigitalOcean as of late October / November 2023.

When I traveled to Japan for the first time in my life this year, I wanted to explore this fascinating culture and, at the same time, learn something new on the technical side during this six-week vacation. I came across the Cloud Resume Challenge by Forrest Brazeal. Even with more than a decade of experience as an SRE, I thought this challenge could be a great opportunity to uncover the unknown unknowns and delve into areas of the tech stack I don't typically touch in my everyday work. I chose the DigitalOcean flavor of the challenge simply because I hadn't worked with this cloud provider yet.

Oita: A Journey Ahead

So there my girlfriend and I were, after a 14-hour flight across half of the globe, in an airport hotel near Narita/Tokyo. I had my notebook with me, and since it is quite easy to get data-only SIM cards in Japan, I had mobile connectivity right from the start of this journey. One of the benefits of working with public clouds is that there is usually no need to set up a VPN. This allows you to work from anywhere, including more rural areas like the Oita prefecture in Japan's southern Kyushu, where we were flying to the next day.

In Oita, I signed up for DigitalOcean's generous 60-day free trial from a branch of Kaikatsu Club, a common manga café chain in Japan. It took a couple of days to get my account unlocked because I was asked about the purpose of the account. An email to customer service got that sorted out.

With the device, mobile internet, a cloud account, and a belly full of Tori Karaage, we hit the road in our Kei car.

Kagoshima: Balance Coding With Exploring Japan's Hidden Gems

While driving south along the beautiful coastlines of Miyazaki towards Kagoshima, I considered a tentative plan to dedicate around half an hour every morning to the resume challenge. During this time, I would enjoy my first cup of coffee, and that worked out quite well for the first couple of days. I also clicked together the first basic application on DigitalOcean's App Platform and re-discovered Buildpacks, which were originally invented by Heroku and have been adopted by various cloud providers.

I am not much of a front-end developer, so I started with a resume template I found on CodePen and added a basic email sign up form:

<div class="row mt-5 me-5">
  <div class="col p-0">
    <p class="title">Contact</p>
  </div>
</div>
<div class="row">
  <div class="col">
    <form v-on:submit="addMail">
        <label for="email" class="form-label">Leave your mail to get in touch</label>
        <div class="input-group">
            <input type="email" id="email" v-model="form.email" placeholder="name@example.com">
            <button type="submit" class="btn btn-outline-secondary">Submit</button>
        </div>
    </form>
  </div>
</div>

Then I only needed to point a DNS record at the app and my static website was available on the internet, as SSL certificates are automatically taken care of on DigitalOcean.
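Sketched with doctl (the domain and app hostname are placeholders of mine; the CNAME target is the default ondigitalocean.app hostname the platform assigns):

```shell
# If the domain is managed by DigitalOcean DNS, point www at the app's
# default hostname; App Platform then provisions the TLS certificate
doctl compute domain records create example.com \
  --record-type CNAME --record-name www \
  --record-data my-resume-app.ondigitalocean.app. --record-ttl 3600
```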

Yakushima: Persist Data In MongoDB Through A Basic Python API

From Kagoshima, we headed over to the island of Yakushima on a speed ferry (a pretty cool experience in itself). The dense woods of Yakushima inspired the enchanting forest settings in Studio Ghibli's anime Princess Mononoke. Before embarking on a hike to the epic Jomon Sugi, a Japanese cedar tree estimated to be at least 2,300 years old, I wanted to set up the MongoDB instance for storing the email subscribers' addresses.

I realized that a MongoDB instance has to be created as a standalone database first before it can be added to an App Platform app. If you want to add a database resource to an app right away, the only option as of now is PostgreSQL. The docs also state that development-tier databases are only available for PostgreSQL.

I swiftly assembled the back-end code using FastAPI and Pydantic to create the API layer responsible for managing email subscriptions. Then I wanted to add the back-end component via DigitalOcean's web UI and hit a first glitch: adding a component to an application does not work in a mobile browser. This is just a minor usability issue, though, and I was able to add the back-end component from my notebook.

Unfortunately, as of this writing, there is no support for connecting to databases over a private VPC connection, so an app has to reach a managed database via the public internet. I am hopeful that DigitalOcean's team is actively addressing this, as other users have raised the need for this feature as well.

Kumamoto: Dockerization, CI/CD and Tests

After recuperating from this great hike to what is likely the oldest tree I will ever encounter in my life, with a rejuvenating hot shower and a restful night's sleep, we journeyed back to the Japanese mainland to resume our exploration, heading towards the north-western city of Kumamoto.

Since my significant other was driving most of the way, I was able to catch up on the latest best practices regarding Poetry in Docker. I was struck by the fact that this topic is still very actively and controversially discussed.

Finally, I came up with this Dockerfile, which was suitable for my simple project:

# Use an official Python builder as a parent image
FROM python:3.9-bookworm as builder

# Install poetry and dependencies
ENV POETRY_VERSION=1.6.1 \
    POETRY_HOME=/opt/poetry \
    POETRY_NO_INTERACTION=1 \
    POETRY_VIRTUALENVS_CREATE=1 \
    POETRY_VIRTUALENVS_IN_PROJECT=1 \
    POETRY_CACHE_DIR=/tmp/poetry_cache

ENV PATH="$POETRY_HOME/bin:$PATH"

RUN curl -sSL https://install.python-poetry.org | python3 -

# Set the working directory in the container
WORKDIR /app

# Copy the local poetry.lock and pyproject.toml to the container
COPY pyproject.toml poetry.lock /app/

RUN poetry install --without dev --no-root && rm -rf $POETRY_CACHE_DIR

# The runtime image, used to just run the code provided its virtual environment
FROM python:3.9-slim-bookworm as runtime

ENV VIRTUAL_ENV=/app/.venv

ENV PATH="$VIRTUAL_ENV/bin:$PATH"

# Set environment variables for the MongoDB connection in the runtime
# stage, so they survive into the final image (defaults, overridable
# at deploy time)
ENV MONGO_HOST=localhost
ENV MONGO_PORT=27017
ENV MONGO_DB=subscriptions

COPY --from=builder $VIRTUAL_ENV $VIRTUAL_ENV

COPY app.py /app/

WORKDIR /app

# Expose the port your FastAPI application will run on
EXPOSE 8080

# Start the FastAPI application
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8080"]

Next up was creating tests. Regardless of whether you opt for unit or integration tests, their inclusion is non-negotiable. Given the straightforward nature of this API - a CRUD-like application - mocking the database is not a good idea, as it:

  1. adds unnecessary complexity, because you would need to mock a significant part of the DB engine
  2. gives a false sense of security, as you might not be able to replicate the mocked DB's behavior completely anyway

With today's containers, you can create a throwaway database for testing purposes quickly and easily.
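In practice this can be as simple as the following (the image tag and environment variable names are my assumptions):

```shell
# Start a disposable MongoDB; --rm removes the container when stopped
docker run -d --rm --name test-mongo -p 27017:27017 mongo:7

# Run the suite against it, then tear the container down
MONGO_HOST=localhost MONGO_PORT=27017 poetry run pytest
docker stop test-mongo
```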

To automate the test suite, I used the setup-python and install-poetry GitHub Actions. Re-deploying the application whenever changes are detected in the reference GitHub repository works quite well, but Autodeploy cannot be configured to react only to changes in a certain subdirectory. So changes in monorepos might lead to unnecessary redeploys.
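A sketch of such a workflow (action versions and the job layout are my assumptions; install-poetry refers to the community snok/install-poetry action):

```yaml
name: tests
on:
  pull_request:          # run for new and updated PRs only, not on merge
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      mongo:             # throwaway database for the integration tests
        image: mongo:7
        ports:
          - 27017:27017
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.9"
      - uses: snok/install-poetry@v1
      - run: poetry install
      - run: poetry run pytest
```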

The final step was to change the endpoint in the front-end code to actually make requests to the back-end API:

const formatResume = (r) => ({
  ...r,
  address: [
    r.country,
    r.city,
    r.postalCode
  ].filter(Boolean).join(', ')
})

const config = {
  development: {
    baseUrl: 'http://localhost:8080',
  },
  production: {
    baseUrl: 'https://my-domain.com/api',
  }
};

const currentConfig = window.location.hostname === 'localhost'
  ? config.development
  : config.production;

let data = formatResume(resume)
data.form = { email: '' }
data.responseData = ''

new Vue({
  el: "#app",
  data: data,
  methods: {
    addMail(e) {
      e.preventDefault();
      axios.post(currentConfig.baseUrl + '/subscribe', {
        email: this.form.email
      }).then(response => {
        console.log('Subscribed ' + this.form.email);
      }).catch(error => {
        console.log(error);
      });
    }
  }
});

Fukuoka: More Automation with Infrastructure As Code

After getting to know the incredibly friendly people in Kumamoto's small Izakayas, we headed off again. Now the highway took us to Fukuoka, the vibrant port city at Hakata bay in the north of Kyushu, where we were able to spend the night in a fully automated hostel.

As it is even more important to automate your infrastructure than your accommodation, I looked into what DigitalOcean can do for me in that regard. I am quite acquainted with Terraform, but the Cloud Resume Challenge asks for the App Platform App Specification.

Relying solely on configuration-as-code principles and repeatable commands, I utilized doctl to create not only the project itself but also the MongoDB instance, along with instructions for configuring firewall rules (Trusted Sources). Since there is currently no way to pass a project ID to doctl commands, all resources like the MongoDB instance are created first in the default project and then have to be migrated over to the target project (see this issue).
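The sequence of commands could look roughly like this (names, region, and size are example values of mine; the placeholder IDs come from the output of the create commands):

```shell
# Create the target project (new resources land in the default project
# first and have to be moved over afterwards)
doctl projects create --name cloud-resume --purpose "Cloud Resume Challenge"

# Managed MongoDB instance
doctl databases create resume-db --engine mongodb \
  --region fra1 --size db-s-1vcpu-1gb --num-nodes 1

# Deploy the app from its App Spec
doctl apps create --spec .do/app.yaml

# Restrict database access to the app (Trusted Sources)
doctl databases firewalls append <database-id> --rule app:<app-id>
```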

Beppu: Closing the Circle - Back To the Ocean

After a few great days and nights in the vibrant city of Fukuoka, our last stop finally took us back to the north-eastern coast of Kyushu: to the Onsen-capital Beppu. At this point, I was basically finished with the coding challenge and our vacation was slowly coming to an end.

I still wanted to polish my project a bit and did some optimizations and refactoring:

  • fixed some pymongo and Pydantic deprecation warnings
  • separated dependency installation in the Docker container from application code (this should bring down build time, as the dependencies are not installed each and every time the app code is changed)
  • changed the CI workflow to run only for new and updated PRs, not (again) when the changes are merged into the main branch
  • added a toast in the front end to notify about successful subscriptions or failures

As the Cloud Resume Challenge is kind of an open-ended project, here is a non-exhaustive list of possible additions and improvements I could have made:

  • extracting the config object from resume.js into a separate file for a cleaner code structure
  • adding a secured endpoint for admin access to the database's content
  • caching optimizations for GitHub Actions
  • more extensive form validation and front end tests
  • shorter timeout for establishing a MongoDB connection
  • monitoring and alerting!
  • and much more...

Conclusion

I would like to say that I finished the Cloud Resume Challenge entirely in Japan, but that was not the case. As I am writing this blog post, I am already back in Berlin, reflecting on my travels through Japan and on what I took away from the challenge.

I found DigitalOcean to be a quite nice and simple platform for hosting your everyday applications, databases, and VMs. There are still some rough edges, but with the existing features and the transparent pricing model, it provides a decent start if you just want to get your prototype up and running in the cloud and scale out later.

It was a cool challenge in the end: I discovered some knowledge gaps and learned a bit more about Poetry in Docker, Pydantic, and Vue.js. I sincerely hope this article ages quickly, that the few kinks I encountered get fixed, and that new features become available on DigitalOcean soon, because I think it's a pretty cool platform.

Top comments (1)

Evan Dolatowski

This is great, I had a great time completing the Cloud Resume Challenge using Azure! Thank you for sharing your experience