Dang Hoang Nhu Nguyen

[BTY] Day 7 & 8: Use NVIDIA Docker Containers to deploy ML models

As you may know, the host’s NVIDIA GPU is not exposed to your containers by default, so you need a little extra setup before your models can use it. Once that is in place, deploying your machine learning frameworks to production becomes easier and more reliable.
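For example, with Docker 19.03+ and the NVIDIA Container Toolkit installed on the host, the `--gpus` flag passes the GPU through to a container. A minimal sanity check might look like this (the CUDA image tag is only an example; pick one that matches your driver):

```bash
# Check that the host GPU is visible from inside a container.
# Assumes the NVIDIA Container Toolkit is installed on the host.
docker run --rm --gpus all nvidia/cuda:11.1.1-base-ubuntu20.04 nvidia-smi
```

If `nvidia-smi` lists the GPU from inside the container, the passthrough works and your framework should be able to use it.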

Here are the main articles I went through when I dockerized my app with GPU support. Along the way, I ran into many problems that those articles do not mention, and I had to dig through various discussion threads to find a fix for each one. The details depend heavily on your architecture, your ML frameworks, your host machine, and so on. I will publish a full write-up soon.

  1. How to Use the GPU within a Docker Container: link
  2. How to Use an NVIDIA GPU with Docker Containers: link
  3. Complete guide to building a Docker Image serving a Machine learning system in Production: link
  4. CUDA + Docker = ❤️ for Deep Learning: link

Deployment Environment (a matching Dockerfile sketch follows the list):

  • Ubuntu: 20.04
  • Graphics card: RTX 3090
  • Python: 3.7
  • Torch: 1.8
  • CUDA: 11.1
  • CUDNN: 8.2.1
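
For reference, here is a rough Dockerfile sketch for that environment. It is only a sketch under assumptions: the base-image tag, the deadsnakes PPA for Python 3.7, the Torch wheel, and the `app.py` entrypoint are examples you would adapt to your own project.

```dockerfile
# CUDA 11.1 + cuDNN 8 runtime on Ubuntu 20.04 (example tag; adjust to your driver).
FROM nvidia/cuda:11.1.1-cudnn8-runtime-ubuntu20.04

# Ubuntu 20.04 ships Python 3.8, so pull 3.7 from the deadsnakes PPA (assumption).
RUN apt-get update && \
    apt-get install -y --no-install-recommends software-properties-common curl ca-certificates && \
    add-apt-repository -y ppa:deadsnakes/ppa && \
    apt-get update && \
    apt-get install -y --no-install-recommends python3.7 python3.7-distutils && \
    curl -sS https://bootstrap.pypa.io/pip/3.7/get-pip.py | python3.7 && \
    rm -rf /var/lib/apt/lists/*

# Torch 1.8 built against CUDA 11.1, from the official wheel index.
RUN python3.7 -m pip install --no-cache-dir \
    torch==1.8.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html

WORKDIR /app
COPY . /app

# Hypothetical entrypoint; replace with your serving script.
CMD ["python3.7", "app.py"]
```

You would build and run it with something like `docker build -t my-ml-app .` followed by `docker run --gpus all my-ml-app` (the image name is hypothetical).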
