Nicolas El Khoury for AWS Community Builders

Proposed Infrastructure Setup on AWS for a Microservices Architecture (1)

Chapter 1: Introduction and Design Considerations.

Traditionally, applications were designed and implemented using a Monolithic architectural style, in which the application is developed and deployed as a single component, internally divided into multiple modules. Monolithic applications are very easy to develop and deploy.

However, such an architectural pattern becomes a burden once the application grows too large:

  1. Difficult to manage and maintain, due to the large codebase.
  2. The entire application is built using a single programming language, so the system may suffer from bottlenecks when performing tasks that language is not well suited for.
  3. Difficult to scale.
  4. Difficult to use with container-based technologies (due to the large size of the application).

With the emergence of Cloud Computing, and the concept of on-demand provisioning of resources, a more suitable architectural pattern was required. Microservices rapidly gained popularity and became a widely used architectural pattern, especially for applications deployed on the cloud. Microservices is an architectural pattern that divides an application into smaller, independent, loosely coupled services that may communicate with each other via multiple protocols (e.g., HTTP, sockets, events). Microservices provide the following advantages:

  1. Easy to maintain (smaller codebase in each service).
  2. Highly scalable.
  3. Extremely suitable for container-based technologies, and a natural complement to cloud solutions.
  4. Fault tolerance: If one microservice fails, the rest of the system remains functional.

Truly, the Microservices architecture is a very powerful architectural pattern that goes hand in hand with the services provided by the cloud. However, a well-designed system depends on two factors: a robust design of the software, and a robust design of the underlying infrastructure. There exist many articles, tutorials, and courses that explain and promote the design and implementation of Microservices. What follows is a detailed description of the points that should be considered when setting up an infrastructure on the cloud to host an application built using the Microservices architecture.

Indeed, an application is only as robust as its underlying infrastructure. Therefore, when designing an infrastructure on the cloud, one must concentrate on achieving the following characteristics:

  • Security: Security is one of the most integral parts of any application. Robust software resists cyber attacks such as SQL injection, password attacks, and cross-site scripting (XSS). Integrating security mechanisms in the code is a mandatory practice to ensure the safety of the system in general, and of the data layer in particular. However, implementing security in the code is not enough; achieving a secure infrastructure is of utmost importance. An example of a bad infrastructure setup would be to deploy a database in a public subnet and secure it with a password only: the database remains permanently exposed to brute-force attempts and to any vulnerability in the database engine itself, no matter how strong the password is. Deploying the database in a private subnet, with no public access to it, mitigates any sort of direct attack on the database. Hiding the database (or any other component) from public access is an excellent security mechanism: the database simply does not exist for anyone outside the private network (a sketch of such a setup follows this list).

  • Availability: Availability refers to the probability that a system is operational and functioning as required, when required. When deploying code on the cloud, one has no information about the location or type of servers used, and no control whatsoever over the underlying hardware. Moreover, one can never fully predict or prevent failures in either the software or the hardware. Therefore, a good practice to achieve availability is to run more than one replica of the system (e.g., replicate the microservices) on more than one machine, and to spawn those machines in multiple distinct data centers (on AWS, multiple Availability Zones); a sketch follows this list.

  • Scalability: Prior to the existence of the cloud, an entity would buy a fixed amount of computing resources, set them up, and deploy the services. One of the biggest disadvantages of this approach was the lack of elasticity, leading to inefficient resource usage: one might end up not using all of the available resources (over-provisioning), or requiring more resources to operate (under-provisioning). The on-demand provisioning mechanism offered by the cloud alleviates these drawbacks by allowing users to scale resources in and out based on the varying load. Therefore, a robust system is one that is scalable. Powerful auto-scaling mechanisms ensure the automatic provisioning of “enough” resources, while minimizing the cost of the system, and without any manual intervention. An example of an auto-scaling policy would be to increase the number of replicas of a microservice should its average CPU utilization exceed 70% for a period of one minute, and to decrease the number of replicas when average CPU utilization falls below 25% for a period of one minute (see the auto-scaling sketch after this list).

  • Performance: Evidently, the application’s performance depends on multiple factors. The Microservices architecture offers the flexibility of writing different parts of the system in different programming languages, depending on each microservice’s task. For instance, NodeJs is a wonderful platform for IO-intensive applications, whereas GoLang is more suitable for CPU-intensive ones. In addition to choosing the best platform, the quality of the code highly affects the performance. Nevertheless, optimizing the code and choosing the correct platforms will not guarantee optimal application performance if the underlying infrastructure is not properly equipped to host the application. For instance, a database deployed on a memory-optimized machine will perform much better than one deployed on a general-purpose machine (a provisioning sketch follows this list). Another example of performance degradation would be to host Microservices on machines with insufficient computing power. Lastly, the geographical region in which the application is hosted also affects the system’s performance: consider an application deployed in a North American region whose users are all based in India; the users will suffer from latency due to the distance between them and the application.

  • System Visibility: One of the most important mechanisms required to achieve a robust system, and the four characteristics above, is system visibility: logging, tracing, and monitoring. Aggregating the application’s logs and displaying them in an organized fashion allows developers to test, debug, and enhance the application. Tracing requests is another important practice, making it possible to follow every request flowing in and out of the system and to rapidly find and fix errors and bottlenecks. For instance, consider a request that has to traverse four Microservices before returning a response to the user. With proper tracing mechanisms, in case of an error, developers can easily identify which Microservice caused the error and the reasons behind it, and thus decrease the time needed to fix the issue. In addition to catching errors, tracing allows developers to find bottlenecks in the system by measuring the time a request spends in each part of the system. Lastly, it is essential to have accurate and reliable monitoring of every aspect of the system; key metrics to monitor include, but are not limited to, CPU utilization, memory utilization, disk read/write operations, and disk space (a monitoring sketch follows this list). Briefly, maximizing system visibility is essential in order to increase the robustness of the system.
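
To make the security point more concrete, here is a minimal sketch, using Python and boto3, of a database security group that accepts traffic only from the application tier’s security group and exposes nothing publicly. The VPC ID, security group IDs, and port are hypothetical placeholders, not values from this series.

```python
import boto3

# Hypothetical identifiers; replace with your real VPC and app-tier security group IDs.
ec2 = boto3.client("ec2", region_name="eu-west-1")

# Security group for the database: it has no inbound rules by default,
# so the database is unreachable until traffic is explicitly allowed.
response = ec2.create_security_group(
    GroupName="db-private-sg",
    Description="Database SG: reachable only from the app tier",
    VpcId="vpc-0123456789abcdef0",
)
db_sg_id = response["GroupId"]

# Allow PostgreSQL traffic (port 5432) only from the application tier's
# security group -- never from 0.0.0.0/0.
ec2.authorize_security_group_ingress(
    GroupId=db_sg_id,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 5432,
            "ToPort": 5432,
            "UserIdGroupPairs": [{"GroupId": "sg-0aaaabbbbccccdddd"}],  # app tier SG
        }
    ],
)
```

Combined with placing the database in a private subnet (no route to an internet gateway), this keeps the database invisible to anyone outside the VPC.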
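
For availability, the sketch below (again Python/boto3, with hypothetical names) creates an EC2 Auto Scaling group that keeps at least two instances running and spreads them across private subnets in two different Availability Zones, so the loss of a single machine or data center does not take the service down.

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

# Hypothetical launch template and subnet IDs; the two subnets are assumed
# to live in two different Availability Zones.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="orders-service-asg",
    LaunchTemplate={
        "LaunchTemplateName": "orders-service-template",
        "Version": "$Latest",
    },
    MinSize=2,          # never fewer than two replicas
    MaxSize=6,
    DesiredCapacity=2,
    VPCZoneIdentifier="subnet-0aaa1111bbbb2222c,subnet-0ddd3333eeee4444f",
    HealthCheckType="EC2",
    HealthCheckGracePeriod=120,
)
```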
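
The auto-scaling policy described above can be approximated as follows, assuming the microservice runs as an ECS service. This sketch uses a target-tracking policy (which adds replicas when average CPU rises above the target and removes them when it stays well below it) rather than two explicit threshold alarms; the cluster and service names are hypothetical.

```python
import boto3

scaling = boto3.client("application-autoscaling", region_name="eu-west-1")

service = "service/demo-cluster/orders-service"  # hypothetical ECS service

# Declare how far the service is allowed to scale.
scaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId=service,
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=10,
)

# Keep average CPU utilization around 70%: scale out when it is exceeded,
# scale back in when utilization stays well below the target.
scaling.put_scaling_policy(
    PolicyName="orders-cpu-target-tracking",
    ServiceNamespace="ecs",
    ResourceId=service,
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
        "ScaleOutCooldown": 60,  # roughly the one-minute window from the text
        "ScaleInCooldown": 60,
    },
)
```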
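
As a sketch of the performance considerations (a memory-optimized machine for the database, deployed in a region close to the users), the snippet below provisions a hypothetical PostgreSQL instance on a memory-optimized class in the Mumbai region; all identifiers and sizes are illustrative only.

```python
import boto3

# ap-south-1 (Mumbai) is chosen because, in this example, the user base is in India.
rds = boto3.client("rds", region_name="ap-south-1")

rds.create_db_instance(
    DBInstanceIdentifier="orders-db",
    Engine="postgres",
    DBInstanceClass="db.r6g.large",          # memory-optimized instance class
    AllocatedStorage=100,
    MasterUsername="dbadmin",
    MasterUserPassword="CHANGE_ME",          # in practice, pull this from AWS Secrets Manager
    DBSubnetGroupName="private-db-subnets",  # private subnets only
    VpcSecurityGroupIds=["sg-0db0db0db0db0db0d"],  # the database SG from the security sketch
    PubliclyAccessible=False,
    MultiAZ=True,                            # standby replica in a second Availability Zone
)
```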
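
Finally, for the monitoring side of system visibility, the sketch below creates a CloudWatch alarm on the CPU utilization of the same hypothetical ECS service, notifying an SNS topic when the metric breaches 70% for one minute; the topic ARN and account ID are placeholders.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

cloudwatch.put_metric_alarm(
    AlarmName="orders-service-high-cpu",
    Namespace="AWS/ECS",
    MetricName="CPUUtilization",
    Dimensions=[
        {"Name": "ClusterName", "Value": "demo-cluster"},
        {"Name": "ServiceName", "Value": "orders-service"},
    ],
    Statistic="Average",
    Period=60,               # one-minute evaluation window
    EvaluationPeriods=1,
    Threshold=70.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmDescription="Average CPU above 70% for one minute",
    AlarmActions=["arn:aws:sns:eu-west-1:123456789012:ops-alerts"],  # placeholder topic
)
```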

In summary, this article listed some of the most important concerns that need to be addressed when designing a cloud infrastructure for Microservices. The following articles will propose and describe a conceptual architectural design for hosting a Microservices application on AWS, and discuss in detail every part of that infrastructure.
