I'm building a Laravel app on my MacBook Pro with Docker by mounting my project folder inside a Docker container. I was working on an API endpoint in the app that had a response time of a little more than a second. This made the interactivity in a connected React app feel painfully slow.
This article will show you how to speed up a containerized Laravel app during development on macOS and Windows by moving Composer's vendor directory into the container.
Why Laravel is slow in Docker
The slow performance has two causes: the PHP request model, and the latency of transferring data between Docker Desktop's Linux VM and the host macOS machine.
When PHP receives a request, it loads all of its dependencies on a per-request basis. Once the request finishes, it discards all of the loaded data. This is different from something like Node.js, where a single thread handles all requests and each module is cached when it's first loaded.
PHP's way of loading dependencies is already inefficient compared to Node. And when you do Docker development on a non-Linux machine, you add the overhead of crossing between Docker Desktop's Linux VM and the mounted host-machine folder for every single dependency file that's loaded. It's the difference between moving books from one shelf to another vs moving books to a shelf in a house down the street.
To keep Docker fast, we want to minimize the number of times we need to cross between Docker Desktop's Linux VM and the host machine. We can do that by storing Composer's vendor/ folder inside the container instead of in the mounted project directory.
Moving the vendor directory
For the rest of this post, we'll assume that you have a container where your app is stored in /srv/app/. We will install the Composer dependencies in /srv/vendor/.
In your Dockerfile, we'll install the Composer dependencies using the RUN command below:
# ...previous Dockerfile commands
WORKDIR /srv/app
COPY . .
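# COMPOSER_VENDOR_DIR tells Composer where to place the installed packages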
RUN COMPOSER_VENDOR_DIR="/srv/vendor" composer install
This will install the Composer dependencies in the /srv/vendor directory, but Laravel can't see them: it expects its dependencies to be in the project root's vendor/ folder. We must update the places where Laravel loads the Composer autoload file.
In artisan, find the following line:
require __DIR__.'/vendor/autoload.php';
And replace it with:
require __DIR__.'/../vendor/autoload.php';
In public/index.php, the autoload path already starts one directory up, so add one more level:
require __DIR__.'/../../vendor/autoload.php';
Also update the path in phpunit.xml.
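In a default Laravel project, the bootstrap attribute in phpunit.xml points at vendor/autoload.php; with the dependencies moved to /srv/vendor, the attribute becomes:
bootstrap="../vendor/autoload.php"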
Run docker build (or docker-compose build if you use Compose), and bring the container back up. When you navigate to your Laravel project in the browser, you should see noticeably faster page-load speeds. In my project, API requests dropped from about 1000ms to about 200ms.
Gotchas
The faster page-load speeds are nice, but if you use IntelliSense & autocomplete, you're really going to want your dependencies in your mounted project directory so that your editor can see them.
You can install your Composer dependencies in your mounted project directory by running the following command from your host machine while the Laravel container is running:
docker exec your-container-name composer install
Another "gotcha" can happen when installing new dependencies using the composer require
command. You may occasionally run into an error like the following:
Class "Laravel\Breeze\BreezeServiceProvider" not found
Script @php artisan package:discover --ansi handling the post-autoload-dump event returned with error code 1
Installation failed, reverting ./composer.json and ./composer.lock to their original content.
This error occurs because when the installation completes, Composer tries to load the Laravel app, which is looking at the vendor/ folder inside the container. When installing dependencies, it may be best to run the composer require command twice: once for the vendor folder in your container, and once for the directory that is shared with your host.
# Install in the container first
COMPOSER_VENDOR_DIR="/srv/vendor" composer require [package-name]
# Then install in the directory shared with your host
composer require [package-name]
You may also want to rebuild your image after installing a new package: if you're using Docker Compose, the dependencies you installed inside the running container won't persist once the container is removed and recreated.
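As a rough sketch, assuming a Compose service named app, rebuilding the image and recreating the container looks something like this:
# Rebuild the image so the new package is baked into /srv/vendor
docker-compose build app
# Recreate the container from the rebuilt image
docker-compose up -d app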
Further reading
In addition to moving your dependencies into your container, you can also enable PHP's OpCache to make Laravel load even faster. Kristoffer Högberg has a concise write-up on how to do this.
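If you're building from the official PHP Docker image, a minimal sketch of enabling OpCache is a single line in your Dockerfile (the write-up above covers tuning it further):
# Install and enable the OpCache extension
RUN docker-php-ext-install opcache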
Michaël Perrin wrote a post called "3 ways to get Docker for Mac faster on your Symfony app" that has some interesting performance measurements before and after optimization. Michaël's post was the inspiration behind this article, and it's definitely worth a read.
Alternatively, a few people have mentioned that you might be able to sidestep these Docker-related performance issues entirely by using Octane, so if you need the best possible performance then take a look at that as well.
Addendum
I've been using this setup for about ten days now, and I feel compelled to give a clear and unambiguous warning: moving your vendor folder causes problems.
This setup caused php artisan test to completely stop working in my CI. Running Xdebug with this setup is nearly unusable. I'm actively working around these issues so that I can enjoy the performance benefits, but if you use this configuration you will run into problems. However, you may still decide the trade-offs are worth it.
Top comments (26)
Cool concept. I've just started playing with a Docker/WSL environment and noticed how much slower it is than a native (non virtualized) Mac install.
Do you have any advice for moving the vendor folder with a Laravel Sail setup?
I didn't manually configure any of my containers, so I'm not sure how the WORKDIR command would work in this setting.
Oh man, I wish I did. Laravel Sail is what made me fall in love with Docker. I don't use it much anymore though, because I'm configuring my own containers. Docker has some amazing qualities, but its performance issues on Mac/Windows sometimes mean it's not the best solution for everything.
There is a solution: you can create your project inside WSL instead of mounting it. If you look at the Sail installation guide, it gives you a step-by-step guide:
laravel.com/docs/master/installati...
I appreciate that you mentioned the weaknesses of PHP frameworks compared to Node.js.
So you always have a copy of "vendor" outside the container and you have to keep it in sync with the "vendor" inside, manually? Or did I miss something here?
No, you didn't miss anything. It's janky. I don't love the solution I came up with in this article, but I considered my app unusably slow during development before I did this. You could always forgo the copy of the dependencies on the host machine if you don't use your editor's IntelliSense. It eliminates the syncing issue at least 🤷♂️
Okay, I understand. It's different, but in the past I had similar problems with "code on host" mounted into VMware, which makes everything terribly slow. Having the code inside the VM is 500% faster. Maybe you should take a look at Laravel Octane, because there everything stays in memory and nothing gets loaded again and again for each new request. Of course, it has some downsides and other things you have to take care of, but when it comes to speed, I guess Octane will be much, much faster than your current solution, even if the code is NOT inside your container ;)
Thanks for the tip! I've been REALLY interested in Octane since Taylor announced it, but I haven't had a chance to dig into it yet. What are the downsides you've run into with Octane?
I am not using Octane right now, but I am planning to use it for an API that creates images on the fly. I have read a lot about it, but I cannot give you any detailed examples right now. In general: code that stays in memory (forever) is always a challenge. You have to write clean code and take a lot of care with everything. If I remember correctly, you should avoid static calls, because those can cause problems. But you have to read some stuff about it; there is already plenty to read, and also existing Docker images, so you can start in a minute ;)
I spent a year writing Node.js, so I know some of the gotchas of things like global mutable state. Hopefully Octane will feel kind of familiar from that.
BTW, I guess statics are not a problem, it could be singletons, found this article, very interesting:
developpaper.com/first-experience-...
Also helpful: youtu.be/T5lkBHyypu8
Hello. Can you explain why putting the vendor folder one level up (not under /srv/app) makes Laravel faster?
In my current setup I'm putting the vendor folder as a subfolder (the normal situation) but don't synchronize it to my machine.
My docker-compose file is creating a volume for the vendor folder, so it exists only in the container.
When I code, I'm using VS Code and develop inside the container, so my vendor folder is there (auto-complete is working and I can debug too).
That said, I'm really interested to hear more about your settings and how the performance is better.
Thanks
Moving the vendor folder into a directory that resides in the container itself is faster than accessing mounted volumes with Docker Desktop on Windows/macOS because it doesn't have to cross between the host OS and Docker Desktop's Linux VM. The container can access files that only exist in the container much more quickly than it can access files that are mounted from the host OS.
As far as my other performance optimizations in the Dockerfile, I enabled PHP OpCache. Kristoffer Högberg has a concise write-up on how to do this.
Oh? I think I've said the same... (I was on my smartphone, probably not the best to write a comment).
I'm putting my ~/vendor folder only in the container, not on my host (in my case, Windows). I'm running docker-composer myapp composer update in my container, and I've created an internal volume to map the vendor folder so it stays in the container and is not synchronized to my host.
My question was: why do you put the folder outside the application directory (so you're forced to update the ~/public/index.php file)? Using an internal volume achieves the same result without the need to hack a Laravel file like index.php.
For OpCache, yes, I've just played with it right now and the improvement is really nice (github.com/cavo789/docker_php_opcache), about 30% faster.
Can you give me an example of what you have in mind? I don't think I understand what you're trying to do.
I think it is complex and very hard to understand why putting the vendor folder in another place will make the project faster.
In my case, I will put Docker in a fast SSD location, then create a volume for the vendor folder. It will maintain your project structure as well as boost your project a little.
A few things:
The article quite often states "Docker's Linux VM", which confuses me. Does Mac have a "Linux subsystem" like windows, or are you running VirtualBox for a VM that Docker is installed within? Docker itself runs "containers" and not virtual machines. The two are quite different.
"When PHP receives a request, it loads all of its dependencies on a per-request basis. Once the request finishes it discards all of the loaded data." This is not quite the full truth. People should read this post on the Zend blog about op-caching and JIT: zend.com/blog/exploring-new-php-ji...
You may also wish to make sure your Composer optimize-autoloader setting is set to true (which I believe the default Laravel installation does). getcomposer.org/doc/articles/autol....
Finally, I was surprised to find out that running files from inside the Docker container is faster than running mounted files from the host OS (a "bind mount"). Bind mounts are supposed to be faster than running from within the container because one is not dealing with the overlay filesystem. More info: docs.docker.com/storage/bind-mounts/. However, it looks like there may be a Mac-specific issue going on here: github.com/docker/for-mac/issues/3677. I came across other articles raising this, and it seems that others are making use of Mutagen to "resolve" this issue so that you don't have to "fake it" with two different vendor directories (one locally and one inside the container). This post has benchmarks and details on how to implement it: accesto.com/blog/docker-on-mac-how.... The other post was: medium.com/netresearch/improving-p...
Thanks for checking out the article, Programster! You've got some great feedback here, and I've updated my article in a few places based on a few points you've raised.
1. Docker's Linux VM
Docker Desktop for Mac runs a Linux VM under the hood. If it didn't, then Docker wouldn't be able to run the containers: containers rely on Linux internals like cgroups.
I think you're right that I didn't do a great job clarifying that the VM is specific to Docker Desktop and not Docker Engine. I've changed everywhere that I said "Docker's Linux VM" to "Docker Desktop's Linux VM." Good looking out on this one, thank you!
2. PHP's request model
You're right that PHP can cache byte code between requests if you have PHP's OpCache installed and enabled. I omitted this from my description because OpCache & JIT are not enabled in PHP by default: you have to compile and configure the OpCache extension. Since they are not enabled by default, I consider the post's description of the PHP request model to be accurate. I did consider mentioning OpCache in this section when I wrote the post, but I thought it would have obfuscated the description.
That said, even if you enable OpCache you'll still see performance improvements by moving the vendor/ directory: OpCache checks file timestamps for changes, and it can do the check quicker if the files it's checking are in the container.
3. Optimize Autoloader
Yep, "optimize-autoloader" is set to true by default!
4. Speed of mounted files
I think you're right that I didn't do a great job explaining that this trick of moving Composer's vendor directory really only helps with Docker Desktop on macOS and Windows. I added a second paragraph to the intro clarifying that this trick is macOS/Windows specific.
Also, I'll take a look at Mutagen. Thanks for the tip! And thank you for taking the time to write such a detailed response: I think this article is better because of the points you've raised.
Great article. One thing that you could do to improve performance would be to have Swoole installed and enabled. Very little change to your main codebase, but the gains are unreal, 800% or thereabouts. I have dockerized a PHP app with a real performance issue; developers were using a convoluted Laravel Homestead setup before, and enabling OpCache didn't do much. Once we wrapped it with Swoole, the entire app became very responsive, and requests per second were amazingly fast.
I plan to try Laravel Octane at some point!
I had painfully slow response times for a large PHP application as well as Laravel apps.
Mounting my working_dir with the cached flag solved this issue.
E.g. (docker-compose):
working_dir: /var/www
volumes:
  # for example (host path assumed):
  - ./:/var/www:cached
Thanks for the tip, Gavin! I'll have to give this a shot.
@tylerlwsmith I came to the internet to solve this same lag issue in the Docker container I'm building and found your post. I found a solution that works. You need to create a named volume for the vendor folder in docker-compose.yml, or you should be able to create the same named volume using docker run.
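A minimal sketch of the docker-compose.yml approach, assuming a service named app and the project mounted at /srv/app:
services:
  app:
    build: .
    volumes:
      # Bind mount the project from the host
      - ./:/srv/app
      # Overlay vendor/ with a named volume that only lives inside Docker
      - vendor_data:/srv/app/vendor

volumes:
  vendor_data: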
Credit to this blog post using this approach for node_modules: burnedikt.com/dockerized-node-deve...
Isn't it much easier to leave all locations as-is, and mount /srv/app/vendor as a volume (NOT a bind mount!) in your docker compose file, or define it as a volume in your Dockerfile?
Then you wouldn't have to mess with ENV variables for the vendor dir, or modify your autoload location.
You could get a copy of your vendor dir on your host machine by using docker cp or some sync tool.