Let’s be honest: software development is hard, especially when you are working on a product with separate components that have to be integrated together. Over the years, different software development techniques have been created to streamline the development process while still meeting requirements and quality targets. However, as cloud-based technologies have flourished in recent years, most existing development methods have become difficult to manage when it comes to building cloud-based software quickly. In today’s blog post we discuss why you should learn Docker and the technologies and processes that surround it.
Let’s get started…
To give you some context, let’s say you are developing a product for a particular customer (a company or a client), and it involves several components: a client database, a static front end, a back end, virtual machines, a production cluster, and so on. You can see this is a heavy project. Now imagine you need to upgrade the database server, or say you want to switch from MySQL to MongoDB. That doesn’t sound like a huge problem, right? It’s certainly not impossible, but to make the transition you have to update not just the database server itself but also every feature that uses it. This increases the time required to complete the upgrade, and the developers may end up introducing more bugs into the system. So the problem is scalability and software evolution.
Now imagine your customer doesn’t want you to set up your own in-house server for this product; instead, it’s supposed to run in the cloud. That means the entire product has to work on a remote server. This gives rise to another problem: you have to manually install all the dependencies onto the remote server before you can deploy the product, and every time a new version or a new feature is added, the dependencies for that feature have to be installed on the server as well. Worse, if a feature requires a different version of a previously installed dependency, the old version has to be replaced, and then every feature that was using the old version has to be updated, otherwise the system will not work as intended. As you can see, this requires a lot of work from the developer just to deploy and improve a product. It slows down development because the developers have to make sure the installed packages aren’t broken and keep taking care of the dependencies. So the problem is portability.
Now that you have understood what the problems are, let’s try to find a fix for all of them…
Let’s get portability out of the way first. How do you make something portable? If you have to move to a new place, you rent a container truck, load all your stuff into it, and transport it wherever you want. That’s the most logical thing to do, right? This is exactly how applications are deployed in a cloud-based architecture.
A container is like a virtual box that has all the necessary dependencies installed for a particular product. It’s lighter than a virtual machine because it runs directly on top of the existing operating system instead of running an entire guest operating system over it. A container runs as a process on the host operating system, and the applications inside it are isolated from the rest of the system; that is, the application runs exactly as it is supposed to, irrespective of the operating system. If that doesn’t sound like a big deal, have a look at this example. Imagine your application needs to run on a cloud server, but the code, scripts, and dependencies live on your local system. To deploy this product to a cloud server, all the dependencies would have to be configured manually on the remote machine. But you have decided to try out this new technology called containerisation. So you write a containerisation script that automatically sets up all the dependencies and deploys the application for you. Not just that: because a container runs over the existing operating system as a process, you can have multiple containers running on the same server without worrying about the dependency conflicts we talked about earlier. How neat, right? That’s exactly what a container is for. It wraps up all the scripts and their dependencies into a single unit that will always run as intended, irrespective of the operating system and environment.
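In Docker’s world, that “containerisation script” is called a Dockerfile. Here is a minimal sketch for a hypothetical Python web app (the base image choice, file names, port, and start command are all illustrative assumptions, not something from a real project):

```dockerfile
# Assumption: a Python web app; start from an official slim base image
FROM python:3.12-slim

WORKDIR /app

# Install the app's dependencies inside the container,
# so the host machine needs nothing pre-installed
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and declare how to start it
COPY . .
EXPOSE 8000
CMD ["python", "app.py"]
```

Anyone with Docker installed can build and run this on any operating system and get the same environment, which is the whole portability point.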
Now, we understood how the portability problem is solved using containers but what about scalability? And software evolution?
Again, let’s understand this with the help of an example. As a developer at a cloud-based company, you have successfully deployed a product which is now gaining traffic, and your company wants you to scale it up, meaning increase its capacity to handle that traffic. With a product deployed on an in-house server, the entire server would have to be reconfigured (hardware- and software-wise), but in a cloud-based architecture that uses a containerisation service, all you have to do is run multiple instances of your image and do some software configuration (load balancing, etc.), and you have successfully scaled up your product. It’s more complex than I’m making it sound, but at a surface level this is what happens. This solves the problem of scalability.
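Running multiple instances of an image really is just a handful of commands. A sketch with plain Docker (the image name `myshop` and the ports are made-up placeholders; a real deployment would usually hand this to an orchestrator such as Docker Swarm or Kubernetes, which also handles the load balancing):

```shell
# Run three instances of the same image on different host ports
docker run -d --name web1 -p 8001:8000 myshop:latest
docker run -d --name web2 -p 8002:8000 myshop:latest
docker run -d --name web3 -p 8003:8000 myshop:latest

# A load balancer (nginx, HAProxy, a cloud LB, ...) then
# spreads incoming traffic across ports 8001-8003
```

Because every instance comes from the same image, each one is guaranteed to have an identical environment, which is what makes this kind of scaling safe.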
After you scale up the product, you realise that the customer requirements have changed: some new features are supposed to replace the older ones (for example, making login more streamlined). Well, guess what you have to do? Develop the more streamlined login method, deploy it as a container in the cloud, and use a RESTful API to communicate with the service. See, your product just evolved based on customer requirements. This solves the problem of software evolution. It is just one way to solve it, and some of you might argue that it is not the right way. Fair enough, but this blog post is meant to explain the benefits of containers and Docker in simple terms, so it doesn’t need to be absolutely precise about the implementation.
Because you are already reading this blog post, I assume you already know what Docker is used for. However, if you are one of those people who came across this post by accident and have no clue what Docker is, don’t worry: it’s not another piece of complex jargon you need to understand before you can dive into containerisation. Here is the internet description of Docker: it’s an open-source project used to build container images that can be easily deployed on a vast array of web service providers.
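To make that description concrete, the everyday Docker workflow boils down to a few commands (the image name, tag, and registry address here are placeholders I made up for illustration):

```shell
# Build an image from the Dockerfile in the current directory
docker build -t myapp:1.0 .

# Run it locally as an isolated process
docker run -d -p 8000:8000 myapp:1.0

# Tag and push it to a registry so any cloud provider can pull and run it
docker tag myapp:1.0 registry.example.com/myapp:1.0
docker push registry.example.com/myapp:1.0
```

Build once, run anywhere the Docker engine runs: that is the entire pitch in four commands.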
So now that you understand why containerisation and Docker are so important in the current industry, you are ready to go out and explore them on your own. Thanks for reading today’s blog post. I’ll see you next time. 😉