Hi! If you didn't know that Docker can limit the physical resources a container uses, this is the place to learn how, and to keep your production servers running smoothly.
At my current job we run a lot of Docker containers in production, but we had never considered how much CPU and RAM they were consuming on our servers.
To solve this, docker-compose offers a section named `deploy` that lets us manage limits and reservations for RAM and CPU. As an example, here is a web service based on Python and Django, backed by a Postgres database.
In this docker-compose.yml you can see `deploy` with `resources`, which we configure to reserve an amount of RAM for the container and to set a limit it cannot exceed:
```yaml
version: '3.8'

services:
  web:
    build: .
    volumes:
      - .:/code
    ports:
      - "8081:80"
    command: "python3 manage.py runserver 0.0.0.0:80"
    deploy:
      resources:
        limits: # Defining the limits on RAM and CPU cores
          cpus: '0.50'
          memory: 512M
        reservations: # Reserving RAM and CPU cores
          cpus: '0.25'
          memory: 128M
  database:
    image: postgres:14.1-alpine
    restart: always
    environment:
      POSTGRES_DB: dockerized
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      PG_SYSTEM_MAX_CONNECTIONS: 500
    ports:
      - '5432:5432'
    volumes:
      - database:/var/lib/postgresql/data

volumes:
  database:
```
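To make the units concrete, here is a small, hypothetical helper (it is not part of Docker, and the `mem_to_bytes` name is my own) sketching how memory strings like `512M` in the compose file map to bytes. As far as I know, Docker accepts the suffixes b, k, m, and g (case-insensitive) and treats them as binary multiples, so 1k = 1024 bytes:

```python
def mem_to_bytes(value: str) -> int:
    """Convert a compose-style memory string such as '512M' to bytes.

    Assumes binary multiples (1k = 1024 bytes), matching how Docker
    interprets these suffixes.
    """
    units = {"b": 1, "k": 1024, "m": 1024**2, "g": 1024**3}
    value = value.strip().lower()
    if value and value[-1] in units:
        return int(float(value[:-1]) * units[value[-1]])
    return int(value)  # a bare number is already in bytes

# The limit and reservation used by the web service above:
print(mem_to_bytes("512M"))  # 536870912 bytes
print(mem_to_bytes("128M"))  # 134217728 bytes
```

So the web container may use up to 512 MiB of RAM, and 128 MiB is reserved for it on the host.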
We can then inspect the running containers and confirm that everything is as we expected.
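One way to check this (assuming the stack was started with `docker compose up -d`) is `docker stats`, which reports live CPU and memory usage per container, including the memory limit:

```shell
# Print a one-shot snapshot instead of a live stream
docker stats --no-stream

# Or narrow the output to just the columns we care about
docker stats --no-stream --format "table {{.Name}}\t{{.CPUPerc}}\t{{.MemUsage}}"
```

One caveat: with the legacy v1 `docker-compose` binary, the `deploy` keys are only honored under Swarm or when you run `docker-compose --compatibility up`; newer Compose (v2, `docker compose`) applies these resource limits with a plain `docker compose up`.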