How to Reduce Docker Image Size: Best Practices and Tips for DevOps Engineers

Table of Contents

  Why Reducing Docker Image Size is Important
  1. Start with a Minimal Base Image
  2. Multistage Builds
  3. Avoid Installing Unnecessary Dependencies
  4. Use .dockerignore to Exclude Unnecessary Files
  5. Optimize Layers in the Dockerfile
  6. Clean Up After Installing Packages
  7. Use Smaller Language Runtimes
  8. Compress Image Layers
  9. Remove Debug Information
  10. Regularly Audit Your Images
  Advanced Tips
  Conclusion

In the world of DevOps, optimizing Docker images is crucial for efficient deployment and orchestration of applications. Reducing the size of your Docker images can improve speed, minimize storage costs, and streamline CI/CD pipelines. This comprehensive guide will walk you through the best practices for reducing Docker image size, along with tips and strategies to help you create lean, efficient images.

Why Reducing Docker Image Size is Important

  • Faster Builds: Smaller images result in faster build times and quicker deployments.
  • Reduced Bandwidth and Storage Costs: Large images take longer to transfer across networks and require more storage, which can become expensive.
  • Faster Container Start Time: Smaller images lead to quicker container startups, which is crucial in dynamic environments where containers need to scale rapidly.
  • Improved Security: Reducing the image size minimizes attack vectors by limiting unnecessary software and dependencies that could be vulnerable.

1. Start with a Minimal Base Image

The base image serves as the foundation for your Docker image. Choosing a lightweight base image can drastically reduce the overall size of your image. Consider the following base images:

  • Alpine Linux: One of the most popular choices for minimal Docker images, Alpine Linux is only around 5MB, a fraction of the size of a standard Ubuntu base image. It’s designed for simplicity and security, but be aware that using Alpine may require additional work for compiling certain dependencies, since it uses musl libc rather than glibc.

Example:

  FROM alpine:3.18
  • Distroless: Google’s Distroless images are another great option for minimal containers. These images don’t include a shell or package manager and are purpose-built for running applications securely.

Example:

  FROM gcr.io/distroless/base

2. Multistage Builds

Multistage builds allow you to use multiple FROM instructions in your Dockerfile, effectively breaking down your build process into stages. This is especially useful for compiling code and only copying the final artifacts to the production image, leaving behind unnecessary dependencies.

Example of a Multistage Build:

# Stage 1: Build
FROM golang:1.19 AS builder
WORKDIR /app
COPY . .
# CGO is disabled so the binary is statically linked and runs on musl-based Alpine
RUN CGO_ENABLED=0 go build -o main .

# Stage 2: Production
FROM alpine:3.18
WORKDIR /app
COPY --from=builder /app/main /app/
CMD ["./main"]

In this example, the build dependencies (e.g., Golang and source code) are only present in the first stage. The final image contains only the compiled binary and a minimal Alpine base, resulting in a much smaller image.
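If the compiled binary is fully static, you can shrink the final stage even further by using Docker’s empty scratch image instead of Alpine. This is just a sketch that reuses the builder stage above; it assumes the binary was built with CGO disabled so it has no runtime library dependencies.

# Alternative final stage: an empty image containing only the binary
FROM scratch
COPY --from=builder /app/main /main
CMD ["/main"]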

3. Avoid Installing Unnecessary Dependencies

When installing packages or libraries, only include what is necessary for your application to run. Avoid installing development dependencies in your final image. You can use tools like --no-install-recommends when working with apt-get in Debian-based images to avoid extra packages.

Example:

RUN apt-get update && apt-get install --no-install-recommends -y \
    curl \
    ca-certificates \
    && rm -rf /var/lib/apt/lists/*

This approach prevents installing recommended but unnecessary packages, reducing the image size.
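The same principle applies to language package managers. As a sketch for a Python image (the requirements.txt file name is just an assumed convention), install only the runtime requirements and skip pip’s download cache:

# Install runtime requirements only; --no-cache-dir skips pip's download cache
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt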

4. Use .dockerignore to Exclude Unnecessary Files

Similar to .gitignore, the .dockerignore file helps exclude unnecessary files and directories from your Docker build context, preventing them from being copied into your image.

Example .dockerignore File:

node_modules
.git
.env
tmp/
logs/

By excluding these files, you can significantly reduce the size of your image and speed up the build process.

5. Optimize Layers in the Dockerfile

Each instruction in your Dockerfile creates a new layer in the final image, and RUN, COPY, and ADD in particular add filesystem content. To minimize image size, combine related commands into a single RUN instruction when possible. This helps avoid the accumulation of unused files in intermediate layers.

Example Before Optimization:

RUN apt-get update
RUN apt-get install -y python3
RUN apt-get clean

Example After Optimization:

RUN apt-get update && apt-get install -y python3 && apt-get clean

By combining these commands, you reduce the number of layers, and the cleanup runs before the layer is committed, so the cached package files never persist in the image.

6. Clean Up After Installing Packages

During image builds, temporary files like cache or logs are often created, which can inflate the image size. Always clean up package manager caches and other temporary files after installing software.

For Debian-based Images:

RUN apt-get update && apt-get install -y python3 && apt-get clean && rm -rf /var/lib/apt/lists/*

For Alpine-based Images:

RUN apk add --no-cache python3

Using --no-cache with apk ensures no temporary cache files are created, keeping the image size minimal.
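A similar pattern works for Node.js images. This sketch assumes npm 8 or newer (where --omit=dev replaces the older --only=production flag) and a standard package.json/package-lock.json setup:

COPY package.json package-lock.json ./
# Install production dependencies only and clear npm's cache in the same layer
RUN npm ci --omit=dev && npm cache clean --force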

7. Use Smaller Language Runtimes

If your application is written in a language like Python, Node.js, or Java, consider using smaller runtime images. Many languages offer “slim” or “alpine” versions of their runtimes.

Example:

# Instead of using this:
FROM python:3.11

# Use the slim version:
FROM python:3.11-slim

These slim versions remove unnecessary components while still providing the core functionality of the language runtime.

8. Compress Image Layers

Docker compresses image layers when they are pushed to and pulled from a registry, but you can still keep the layers themselves lean. When you bring archives or large binaries into an image, avoid leaving both the compressed file and its extracted contents behind in a committed layer.
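For example, fetch, extract, and delete an archive within a single RUN instruction so the compressed file is never committed to a layer. The URL and paths below are placeholders, and the snippet assumes curl and tar are already available in the image:

# Download, extract, and remove the archive in one layer so it never persists
RUN curl -fsSL https://example.com/tool.tar.gz -o /tmp/tool.tar.gz \
    && tar -xzf /tmp/tool.tar.gz -C /usr/local/bin \
    && rm /tmp/tool.tar.gz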

9. Remove Debug Information

If your application includes debugging symbols or metadata, it’s often unnecessary for production environments. Stripping this data can save space.

Example:

RUN strip /path/to/binary
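For compiled languages you can often avoid embedding debug data in the first place. In Go, for instance, the linker flags below omit the symbol table and DWARF debug information at build time (shown here as a sketch of a builder stage):

# -s omits the symbol table, -w omits DWARF debug information
RUN go build -ldflags="-s -w" -o main .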

10. Regularly Audit Your Images

Over time, your images can bloat due to outdated dependencies or unused software. Use tools like docker image ls and docker image prune to regularly audit and clean up old images.

You can also use Docker’s --squash build flag to combine all layers into a single one, reducing size. Note that it requires experimental features to be enabled on the daemon and is only supported by the classic builder, not BuildKit.
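If experimental features are enabled on your daemon, a squashed build looks roughly like this (the image tag is just a placeholder):

# Requires experimental daemon features; not supported by the BuildKit builder
docker build --squash -t myapp:latest .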

Pruning Unused Images:

docker image prune -f

Advanced Tips

11. Use Docker Image Scanning

Tools like Docker Scout or third-party services (e.g., Trivy or Clair) can analyze your Docker images for vulnerabilities and outdated packages. These tools often provide recommendations to reduce unnecessary libraries and dependencies.
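As a quick sketch (the image name is a placeholder), scanning an image from the command line with Trivy or the Docker Scout CLI plugin looks like this:

# Scan a local image for known vulnerabilities with Trivy
trivy image myapp:latest

# Or list CVEs with Docker Scout
docker scout cves myapp:latest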

12. Use OverlayFS and Shared Layers

In Kubernetes and other orchestrated environments, you benefit from Docker’s layer sharing, which is implemented by the OverlayFS-based overlay2 storage driver. Layers that are common across images (for example, a shared base image) are stored only once on each node, and each container adds only its own changes on top, reducing the total size on disk.
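You can confirm which storage driver your engine is using with docker info; on most modern installations it reports overlay2:

# Print the storage driver in use (typically overlay2)
docker info --format '{{.Driver}}'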

13. Consider Unikernels

If extreme size optimization is needed, explore the use of unikernels. These are single-purpose, lightweight virtual machines that package only the application and its minimal required OS components. They’re much smaller than traditional Docker containers, though they are more complex to implement.


Conclusion

Optimizing Docker image size is a crucial aspect of maintaining efficient and scalable containerized environments. By starting with a minimal base image, leveraging multistage builds, and cleaning up unnecessary files, you can drastically reduce your image size. Following these best practices not only improves the performance of your deployments but also enhances security and reduces costs.

By regularly auditing and refining your Docker images, you ensure that your containers are lean, secure, and production-ready. These steps will save bandwidth, reduce startup times, and provide a more efficient development workflow for your DevOps pipelines.


By applying these techniques, DevOps engineers can create optimized Docker images and improve the overall efficiency of their applications. Smaller images are not just faster to deploy; they also contribute to better security, reliability, and cost-efficiency in cloud-native environments.

👤 Author


Join Our Telegram Community || Follow me on GitHub for more DevOps content!

Top comments (19)

little twinkle

Good optimisation content, thanks for sharing. Clear content with immense knowledge.

Can the above techniques be used in production deployments as well? The prune command is not recommended in higher-level environments.

H A R S H H A A

Thank you so much! 😊 @little_twinkle_0ae2b15172
I'm glad you found the optimization techniques helpful! The strategies outlined can absolutely be used in production environments. However, you're right—using the prune command should be done with caution in higher-level environments.

In production, it's best to carefully manage image and layer cleanup to avoid unintended deletions. Instead of using docker system prune broadly, you might want to implement specific cleanup practices, such as removing unused images and layers only when certain they are no longer needed.

Thanks again for your thoughtful feedback! 🙌

David J Eddy

Replace Docker with containerd or Podman or any other CLI tool, please. Docker Inc. is constantly becoming more and more of a hostile organization to the OSS community and the masses of engineers who supported it in the early days.

All the above still holds true for any containerized workload. Just don't use the Docker CLI/Desktop.

H A R S H H A A

Thank you for your input! 🙏 @david_j_eddy
You make a valid point about Docker's relationship with the OSS community, and I agree that the best practices mentioned in the article apply to any containerized workload, whether you're using Docker, Podman, or another CLI tool.

I'll definitely consider adding a section on alternatives like Podman and how the same strategies can be applied across different containerization platforms. My goal is to help DevOps engineers optimize their containers regardless of the specific tool they choose.

Thanks again for your valuable feedback! 😊

Sanjay Paul

Great post! Reducing Docker image size is such an essential topic for DevOps teams, and you've laid out some fantastic strategies here. I particularly love the emphasis on multistage builds and using minimal base images—those alone can lead to significant improvements in efficiency and security.

The reminder to regularly audit images is also crucial; it’s easy to let things bloat over time. Tools like Trivy for scanning are game changers in maintaining lean images while keeping an eye on vulnerabilities.

Thanks for sharing these best practices! I’ll definitely be implementing some of these tips in my next project. Keep up the awesome work!

H A R S H H A A

Thank you so much for the kind words! 😊 @paulsanjay81
I'm really glad the post resonated with you and that you found the strategies useful. Multistage builds and minimal base images can indeed make a huge difference in optimizing Docker images, both in terms of efficiency and security. Regular audits with tools like Trivy are definitely game-changers, and it's great to hear that you'll be implementing some of these tips in your next project!

Thanks again for your feedback, and I’ll keep sharing more DevOps insights. Best of luck with your projects! 🙌

Kyle Quest

There's also DockerSlim (aka MinToolkit, aka SlimToolkit), which automatically shrinks container images for you. The most basic way to use it is "mint slim nginx:latest", where "nginx:latest" is the container image you want to shrink.

Lots of examples here: github.com/mintoolkit/examples. Take a look at the 3rdparty directory there... it has a lot more examples, including FastAPI, Spring, and Micronaut examples.

H A R S H H A A

Thank you for sharing this! 🙌 @kcq
DockerSlim (MinToolkit/SlimToolkit) is indeed an excellent tool for automatically shrinking container images, and I appreciate you mentioning it! It's a great addition for those looking to automate and further optimize their images without manual intervention. I'll definitely take a look at the examples in the repository you shared—it's especially useful to see examples for FastAPI, Spring, and Micronaut.

I'll consider adding a section in the article to highlight DockerSlim as another powerful optimization tool. Thanks again for contributing such valuable information! 🚀

H A R S H H A A

🐳 Reducing Docker Image Size – From 1.5GB to Just 55MB 🐳

Building efficient and lightweight Docker images is key to faster deployments and better resource management. In my latest project, I successfully reduced a Docker image from a massive 1.5GB to just 55MB! This optimization can drastically improve your CI/CD pipelines, reduce storage costs, and speed up deployment times.

Mai Chi Bao

Wow, my Docker images for AI projects are usually so big.

H A R S H H A A

Thanks for your comment! 😊 @mrzaizai2k
AI projects, especially with all the libraries and dependencies involved, can indeed lead to some pretty large Docker images! The tips in this article, like using minimal base images, multistage builds, and cleaning up unused dependencies, can help reduce those image sizes. It’s definitely worth a try for AI projects as well!

Feel free to reach out if you need any more specific advice on optimizing your Docker setup for AI workloads. Glad you found the article useful! 🙌

Frank Müller

Great article. Saved me some time!

H A R S H H A A

Thanks mate @frankzonarix

Anil Mohite

👍 Thanks for sharing

The Eagle 🦅

Thanks for the info dawg!

H A R S H H A A

Thanks 👍 mate @raulpenate

USMAN AWAN

Nice 👍

H A R S H H A A

Thanks 👍 @usman_awan