monotiller

Third month in DevOps

My third and final month in training comes to a close. We focussed a lot on configuration and orchestration tools as well as a final project to bring everything together.

Apologies

First off, I must apologise for this post being late; since finishing the course I have had quite a bit of paperwork to fill out and meetings about what comes next. I can't reveal any details as of yet, but it's definitely taking me in a direction I was not expecting, and it's a good one to be sure!

Week 9 - Ansible

We started with an introduction to Ansible. Having been introduced to Vagrant in the previous weeks, the addition of Ansible was a nice complement.

Ansible allows you to automate the configuration of virtual machines very easily: you can group virtual machines together and run batch operations across several at once.

A good example: say you have six virtual machines, three running an app and three hosting a database that the app will pull from. You can group them like so:

| App | Database |
| --- | --- |
| 192.168.1.1 | 192.168.1.4 |
| 192.168.1.2 | 192.168.1.5 |
| 192.168.1.3 | 192.168.1.6 |

And in a hosts file that would look something like:

[app]
192.168.1.1 ansible_connection=ssh ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
192.168.1.2 ansible_connection=ssh ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
192.168.1.3 ansible_connection=ssh ansible_ssh_user=vagrant ansible_ssh_pass=vagrant

[database]
192.168.1.4 ansible_connection=ssh ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
192.168.1.5 ansible_connection=ssh ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
192.168.1.6 ansible_connection=ssh ansible_ssh_user=vagrant ansible_ssh_pass=vagrant

Edit: As mentioned over on LinkedIn, don't store your Ansible passwords in plain text like I did here; use some way to protect them. I used Ansible Vault.

Then, let's say you want to install the software needed to run your app, plus the app itself, on all the virtual machines in the app group. You just need to write a playbook (here's one, but note that app is called web there), and running it automatically carries out the steps on your app virtual machines.
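As a rough sketch, a playbook targeting the [app] group might look like the following — the package name and file paths here are made up for illustration, not taken from the actual playbook:

```yaml
# playbook.yml — run with: ansible-playbook -i hosts playbook.yml
- hosts: app                    # targets the [app] group from the hosts file above
  become: yes                   # run the tasks with sudo
  tasks:
    - name: Install the web server (hypothetical package)
      apt:
        name: nginx
        state: present
        update_cache: yes

    - name: Copy the app onto each machine (hypothetical paths)
      copy:
        src: ./app
        dest: /home/vagrant/app
```

Because every machine under [app] is in the inventory, the same tasks run on all three hosts from a single command.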

The great thing is that it not only scales up as needed but is also agentless, meaning it can manage basically anything reachable over SSH; Ansible only needs to be installed on one control machine to talk to many.

I actually really enjoyed using Ansible for orchestration and I think the yaml structure used for configuration is something that personally really clicks with me. I love that you are able to break down the configuration into tasks so someone who may not be so familiar with the tools is able to follow along with what's happening.

Learning Ansible actually pushed me towards learning GitHub Actions for use in my own personal projects. I mean, it's built into the platform that I use to host my code and it costs zero additional pounds, so why not, eh? Plus it allows me to show off the above feature of Ansible quite nicely:

[Image: a GitHub Actions run, listing the name of each step]

As you can see, it shows the name of each step, but if the user wants to they can expand a step to get more verbose output. In a command-line interface that is usually done through flags instead.
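The task-by-task structure maps quite naturally onto an Actions workflow. Here's a minimal sketch — the file name and steps are illustrative, not my actual workflow:

```yaml
# .github/workflows/test.yml
name: Tests
on: push                              # run on every push to the repository
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the code      # each named step shows up in the run log
        uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run the tests
        run: python -m pytest
```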

The rest of the week was spent bringing together the use of Vagrant and Amazon Web Services with Ansible to test out hybrid cloud solutions, as well as presenting our findings in the form of a presentation.

Week 10 - Terraform & Docker

Terraform

Wow, one tool to quickly set up virtual machine images and instances in one package as well as deal with VPC configuration? Say it ain't so! But it is!

Terraform has been a good tool, although we were somewhat limited in what we could use it on. The company didn't want us all spinning up multiple EC2 instances each, and I can understand that, so we basically used it to set up one EC2 instance. But once we know how to do it once, we can do it again.
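For reference, standing up a single instance like that only takes a few lines of configuration. This is a hedged sketch — the region, AMI ID and tag are placeholders, not the values we actually used:

```hcl
# main.tf — apply with: terraform init && terraform apply
provider "aws" {
  region = "eu-west-1"                      # placeholder region
}

resource "aws_instance" "app" {
  ami           = "ami-0123456789abcdef0"   # placeholder AMI ID
  instance_type = "t2.micro"

  tags = {
    Name = "terraform-demo"                 # hypothetical name tag
  }
}
```

Running terraform plan first shows exactly what will be created before anything is touched, which is half the appeal.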

Docker

I love Docker. Package up an app as an image, upload it somewhere and just ask someone to run it. Definitely takes all the guesswork out of getting something set up.

If you'd like a demo app of mine, feel free to view the Docker image on either Docker Hub or GitHub Packages. This will launch the Job Search project my team and I worked on in the sixth week. My updated version, including the GitHub Actions workflow, can be found here:

Job Search Project

Now with Docker support

Briefing

The briefing was to build on top of our previous project and to include a data-scraping tool that would scour a website (in this case ITJobsWatch) and show the top 50 job roles on the website.

How to set up the website

Docker

The easiest way to view the website is through Docker! Simply run docker run -p 5000:5000 monotiller/eng89_jcp and navigate to http://127.0.0.1:5000/

Manual setup

  1. Clone the repository (or download it as a zip and extract it)
  2. Make sure Python 3.9.5 or above and the pip package manager are installed
  3. Move into the folder containing the project
  4. Run pip install -r requirements.txt to install the required packages to run the site
  5. Run export FLASK_APP=main.py
  6. Run flask run
  7. Head to http://127.0.0.1:5000/

Features

Role page

The role page is capable of searching through and sorting all available roles that have been scraped, it's up…

Suffice it to say that this is my new favourite tool. I know I said that earlier, but this is my NEW new favourite tool. It's not the quickest out there, but it is by far the easiest way of getting things to the places they need to be without overhead.

The greatest part of Docker for me is that it doesn't require setting up virtual machines, so although you are technically running a somewhat virtualised environment inside your host operating system, you're not having to run a full guest OS on top.

As a developer I can specify exactly what I need, and when a user runs the Docker image it downloads only exactly what I wanted it to. Let's look at the Dockerfile for the job search app:

FROM python
WORKDIR /usr/src/project
COPY . .
RUN apt-get update && apt-get install -f -y
RUN apt-get install python3-pip -y
RUN pip3 install -r requirements.txt
EXPOSE 5000
ENV FLASK_APP=main.py
CMD [ "python3", "-m", "flask", "run", "--host=0.0.0.0" ]

Taking a look step by step:

  1. We're asking to download the Python image from Docker Hub, this by default comes with a basic Debian Linux installation and some common packages. I could've specified a Python version here but since I am working with the latest version of Python I thought it made sense to line these up. As an aside, when we were doing our final project we did end up switching to the python:slim images instead as we found that not only were they faster to load but came with everything we needed anyway.
  2. We set our working directory
  3. We make sure our packages are up to date and install any missing packages. I know this step isn't strictly required, but I once had an error caused by a missing package; I'm not sure why that happened, so as a sanity check I'll keep it in for now
  4. We install the pip package manager, since we'll want it to install everything in our requirements.txt file, which holds all of the packages needed to run the app
  5. We install the following:
    • flask
    • flask_wtf
    • passlib
    • requests
    • pandas==1.3.2
    • flask_table
    • list_function
    • lxml
  6. We expose port 5000, since this is the default Flask port (EXPOSE documents the port; it still needs publishing with -p at run time)
  7. We create the environment variable which will point to the file that contains the instructions that run our flask app
  8. We ask to run the flask app on 0.0.0.0

There we go: we can simply navigate to localhost:5000 and the app will be there, all from typing in docker run (with -p to publish the port). We can customise as well, so if we want the app mapped to a certain host port or whatever, we can do that too.
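For instance, the port mapping is just a flag on the run command:

```shell
# Publish the container's port 5000 on the same host port
docker run -p 5000:5000 monotiller/eng89_jcp

# Or map it to a different host port, e.g. 8080
docker run -p 8080:5000 monotiller/eng89_jcp
```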

Anyway, that's enough gushing over Docker. We did briefly touch on Docker Compose, but rather than focussing on that now we decided to move on to:

Week 11 - Kubernetes

Kubernetes (K8s) is an orchestration tool, not too dissimilar to Ansible in function but designed to work at much larger scales. It also has tools for things like disaster recovery, so if your Docker containers go down for whatever reason, fresh ones can automatically replace the broken ones, and horizontal scaling, so if you need more instances you don't have to worry about it: you can set an upper limit and K8s will automatically create new instances for you.

Basically, as long as you have the computing power available, K8s will try its hardest to make sure your apps don't go offline. Plus it can work with hybrid deployment solutions too, so you could run on your own hardware until you need more power, at which point you start spinning up containers in the cloud instead.

We didn't get much of an opportunity to really test the damage control out, but we did set up a maximum of three containers, intentionally kill one off, and watch as K8s had a replacement up in seconds. Cool technology, honestly, and I was blown away by the speed at which it got a new one running.
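That self-healing behaviour comes from declaring a desired replica count and letting K8s reconcile towards it. A minimal Deployment sketch, reusing the Docker image from earlier (the names and labels are made up for illustration):

```yaml
# deployment.yml — apply with: kubectl apply -f deployment.yml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: jcp                       # hypothetical deployment name
spec:
  replicas: 3                     # K8s keeps three pods running at all times
  selector:
    matchLabels:
      app: jcp
  template:
    metadata:
      labels:
        app: jcp
    spec:
      containers:
        - name: jcp
          image: monotiller/eng89_jcp   # the job search image from earlier
          ports:
            - containerPort: 5000
```

Delete one of the pods with kubectl delete pod and the Deployment controller immediately schedules a replacement to get back to three.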

Week 12 - Final Project

Last but by no means least we were set our final project. The task was for the entire teaching group (all 15 of us) to take a website and database combination and automate the process of testing and deploying the app on AWS EC2.

The app chosen was my group's previous job search website. The updated version can be found here:

engineering89-final-project / jcpp

Presenting JobCentre++. The frontend of the final project

Engineering 89 Final Project

Now with Docker support


We split ourselves into four teams: Automation, Backend, Frontend and Testing.

Jenkins was the central part of our project. As soon as a push was made to GitHub, a webhook would inform Jenkins. First some tests were run: a test instance was spun up on EC2, which ran through the pytest units. We tested database connections as well as the ability to read from and write to the database. The instance would then be terminated after successful testing.

If those tests passed, a Docker image was created from the GitHub repository, pushed to Docker Hub, and an email alert sent out to those on the team. Then another EC2 instance would be spun up to run the Docker image with the finished website, or, if an EC2 instance was already running, the new Docker image would simply replace the old one.

AWS S3 was used to host the database files so the app could be updated independently of the information it was hosting. This was perfect, since data recovery was now easier, we didn't have to worry about keeping every copy of the database up to date, and it left us room for caching too should we need it in the future!

AWS CloudWatch was used to monitor resource usage on the instances as well as to collect log files. We set up some triggers here as well to react to changes in AWS resources. Paired with CloudWatch was Gatling, which we used for load testing; running it a few times, we found we could sustain on average about 400 page loads per second before there was any significant slowdown (on a t2.micro instance).

From start to finish the entire process took just over 4 minutes from cold which we were happy with. We could've streamlined the process a little bit here and there but to go live to the world in that short amount of time was incredible.

Really proud of the group as a whole, we managed to pull together all the knowledge of the past 11 weeks and do all the DevOps things we needed to get an app up and running for the world to see!

Epilogue

Looking back over these three months of training, it has become clear to me what a well-oiled machine DevOps is, even from an outsider's perspective. There is an abundance of tools available to us to do pretty much whatever we need whenever we want to (or not, in the case of automation).

Working at scale is very different to what I'm used to in Electronic and Computer Systems Engineering, but I found it clicked really easily. I'm glad I've made the change to DevOps; it's been really rewarding learning all these technologies and I can't wait to see what's coming up next. But unfortunately, for now, that's a story for another day.

What's next?

I don't really know. As much as I'd like to keep writing these articles, there is a problem when it comes to divulging sensitive information, and I don't want these to become repetitive, so I might slow the frequency down, perhaps to bi-monthly or even semi-yearly. But I guess I could also just write about new findings whenever I come across something new!

Either way, thank you for joining me on this journey and I hope to see you again soon!

Discussion (2)

Lucas Salustiano

Nice to see your achievements; I'm happy to see you've finished your project. For the next steps, what about writing some tutorials based on what you've learned, with a deeper dive into some concepts? I think it would be great for beginners in the DevOps role.

I wish the best for you in your career. Good luck.

monotiller Author

Thank you for reading! I will think about doing tutorials but I’m not very experienced in doing so. But I’ll definitely give it some thought!