I know multi-stage builds, and I'm even using them to avoid leaking secrets into Docker images. But a multi-stage build doesn't improve your build time ;-) With a multi-stage build you can reduce the size of a Docker image, but not the build time. And all the packages I'm adding to the base image are needed in production. I cannot leave out libxml, for example, because then the HTML rendering would not work.
The method I described is really useful for improving the day-to-day Docker build time. Of course, you can combine it with Docker multi-stage builds; that's something I'm doing as well. I'll write another blog post about that topic.
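As a rough sketch of combining the two approaches (the image name `myorg/app-base` is hypothetical, and I'm assuming a Ruby app, as the libxml/HTML-rendering example suggests — the base image is assumed to have all system packages and dependencies pre-installed):

```dockerfile
# Build stage: starts from the pre-built base image,
# so no packages or gems are installed here
FROM myorg/app-base:latest AS build
WORKDIR /app
COPY . .
RUN bundle exec rake assets:precompile

# Runtime stage: same base (libxml etc. are needed in production),
# only the built application is copied over, keeping the image small
FROM myorg/app-base:latest
WORKDIR /app
COPY --from=build /app /app
CMD ["bundle", "exec", "puma"]
```

This way the base image handles build *time* (dependencies are pre-installed) and the multi-stage split handles image *size*.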
Sorry, I don't follow. How does it not improve your build time? Are you not caching layers?
If you run your builds on a CI server, all the dependencies, in all Docker stages, have to be installed on each build, right? But if you refer to a base image that already contains all the dependencies, it saves time because the image only needs to be downloaded. Correct me if I'm wrong. Always willing to improve my knowledge :)
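A minimal sketch of such a base image (filenames, registry, and versions are hypothetical; Ruby is assumed for illustration):

```dockerfile
# Dockerfile.base -- built and pushed only when dependencies change:
#   docker build -f Dockerfile.base -t registry.example.com/app-base:1.0 .
#   docker push registry.example.com/app-base:1.0
FROM ruby:3.2-slim

# System packages needed at runtime (e.g. libxml2 for HTML rendering)
RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential libxml2-dev \
    && rm -rf /var/lib/apt/lists/*

# Application dependencies baked into the base image
WORKDIR /app
COPY Gemfile Gemfile.lock ./
RUN bundle install
```

The per-commit CI build then starts with `FROM registry.example.com/app-base:1.0` and only copies the application code, so the expensive apt-get/bundle steps are replaced by a single image download.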
Ah, I guess I misunderstood the post. I thought you were talking about speeding up builds in a local dev context, not CI.
My company uses codefresh.io, which provides Docker layer caching in builds, but that's not typical. With most CI systems (CircleCI, Travis CI, the dreaded Jenkins) you wouldn't get this out of the box. You would if you had your own permanent CI server, I guess, but who the heck does that anymore? :D
I have a little experience in rolling my own Docker caching in CI. You can try using the `--cache-from` flag (docs.docker.com/engine/reference/c...), but it ends up being a bunch of extra scripting with probably a low ROI vs. doing as you suggest and building your own base image.
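For reference, the scripting usually looks something like this CI fragment (registry and image names are hypothetical):

```shell
#!/bin/sh
# Pull the last published image so its layers are available locally as cache;
# '|| true' keeps the very first build from failing when no image exists yet.
docker pull registry.example.com/myapp:latest || true

# Tell the build to reuse matching layers from the pulled image.
docker build \
  --cache-from registry.example.com/myapp:latest \
  -t registry.example.com/myapp:latest .

# Push so the next CI run can use this build as its cache source.
docker push registry.example.com/myapp:latest
```

It works, but as noted above, a pre-built base image achieves most of the same savings with less per-pipeline scripting.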