Cool concept. I've just started playing with a Docker/WSL environment and noticed how much slower it is than a native (non-virtualized) Mac install.
Do you have any advice for moving the vendor folder with a Laravel Sail setup?
I didn't manually configure any of my containers, so I'm not sure how the WORKDIR command would work in this setting.
Oh man, I wish I did. Laravel Sail is what made me fall in love with Docker. I don't use it much anymore though, because I'm configuring my own containers. Docker has some amazing qualities, but its performance issues on Mac/Windows sometimes mean it's not the best solution for everything.
There is a solution: you can create your project inside WSL instead of mounting it. If you look at the Sail installation guide, it gives you a step-by-step walkthrough:
laravel.com/docs/master/installati...
I appreciate that you mentioned the weaknesses of the PHP framework compared to Node.js.
So you always have a copy of "vendor" outside the container, and you have to keep it in sync with the "vendor" inside - manually? Or did I miss something here?
No, you didn't miss anything. It's janky. I don't love the solution I came up with in this article, but I considered my app unusably slow during development before I did this. You could always forgo the vendor copy on the host machine if you don't use your editor's IntelliSense. It eliminates the syncing issue, at least 🤷‍♂️
Okay, I understand. It's different, but in the past I had similar problems with "code on host" mounted into VMware, which made everything terribly slow. Having the code inside the VM was 500% faster. Maybe you should take a look at Laravel Octane, because there everything stays in memory and nothing gets loaded again and again for each new request. Of course, it has some downsides and other things you have to take care of, but when it comes to speed, I guess Octane will be much, much faster than your current solution, even if the code is NOT inside your container ;)
Thanks for the tip! I've been REALLY interested in Octane since Taylor announced it, but I haven't had a chance to dig into it yet. What are the downsides you've run into with Octane?
I'm not using Octane right now, but I'm planning to use it for an API that creates images on the fly. I've read a lot about it, but I can't give you any detailed examples right now. In general, code that stays in memory (forever) is always a challenge: you have to write clean code and take a lot of care with everything. If I remember correctly, you should avoid static calls, because those can cause problems. But you have to read some stuff about it; there is already plenty to read, and also existing Docker images, so you can start in a minute ;)
I spent a year writing Node.js, so I know some of the gotchas of things like global mutable state. Hopefully Octane will feel kind of familiar from that.
BTW, I guess statics are not the problem; it could be singletons. I found this article, very interesting:
developpaper.com/first-experience-...
Also helpful: youtu.be/T5lkBHyypu8
Hello. Can you explain why putting the vendor folder one level up (not under /srv/app) makes Laravel faster?
In my current setup I'm putting the vendor folder as a subfolder (the normal situation), but I don't synchronize it to my machine.
My docker-compose file creates a volume for the vendor folder, so it exists only in the container.
When I code, I'm using VS Code and developing inside the container, so I still have my vendor folder (auto-complete is working and I can debug too).
That said, I'm really interested to hear more about your setup and how the performance is better.
Thanks
Moving the vendor folder into another directory that resides in the container itself is faster than accessing mounted volumes on Docker Desktop in Windows/MacOS because it doesn't have to cross between the host OS and Docker Desktop's Linux VM. The container is able to access files that only exist in the container much quicker than it can access files that are mounted from the host OS.
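As a rough sketch of the idea in docker-compose terms (the service name, paths, and the use of Composer's COMPOSER_VENDOR_DIR variable below are illustrative assumptions rather than my exact setup), only the application code gets bind-mounted from the host, while Composer installs dependencies to a path that exists solely inside the container:

services:
  app:
    build: .
    working_dir: /srv/app
    environment:
      # Composer installs dependencies here instead of /srv/app/vendor;
      # the autoloader path in public/index.php has to point here as well.
      COMPOSER_VENDOR_DIR: /srv/vendor
    volumes:
      # Only the project code crosses the host <-> Linux VM boundary.
      - ./:/srv/app

With a layout like that, composer install has to run inside the container, since /srv/vendor never exists on the host.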
As for my other performance optimizations in the Dockerfile, I enabled PHP's OpCache. Kristoffer Högberg has a concise write-up on how to do this.
Oh? I think I've said the same... (I was on my smartphone, probably not the best place to write a comment.)
I'm putting my ~/vendor folder only in the container, not on my host (in my case, Windows). I'm running docker-compose myapp composer update in my container, and I've created an internal volume to map the vendor folder so it stays in the container and isn't synchronized to my host.
My question was: why do you put the folder outside the application directory (so you're forced to update the ~/public/index.php file)? Using an internal volume achieves the same result without the need to hack a Laravel file like index.php.
For OpCache, yes, I've just played with it and the improvement is really nice (github.com/cavo789/docker_php_opcache), about 30% faster.
Can you give me an example of what you have in mind? I don't think I understand what you're trying to do.
I think it's complex and very hard to understand why putting the vendor folder in another place makes the project faster.
In my case, I put Docker on a fast SSD and then create a volume for the vendor folder. That maintains your project structure and gives your project a small boost.
A few things:
The article quite often states "Docker's Linux VM", which confuses me. Does Mac have a "Linux subsystem" like Windows, or are you running VirtualBox for a VM that Docker is installed within? Docker itself runs "containers" and not virtual machines. The two are quite different.
"When PHP receives a request, it loads all of its dependencies on a per-request basis. Once the request finishes it discards all of the loaded data." This is not quite the full truth. People should read this post on the Zend blog about op-caching and JIT: zend.com/blog/exploring-new-php-ji...
You may also wish to make sure your Composer optimize-autoloader setting is set to true (which I believe a default Laravel installation already does). getcomposer.org/doc/articles/autol....
Finally, I was surprised to find out that running files from inside the Docker container is faster than running files mounted from the host OS (a "bind mount"). Bind mounts are supposed to be faster than running from within the container because one is not dealing with the overlay filesystem. More info: docs.docker.com/storage/bind-mounts/. However, it looks like there may be a Mac-specific issue going on here: github.com/docker/for-mac/issues/3677. I came across other articles raising this, and it seems that others are making use of Mutagen to "resolve" this issue so that you don't have to "fake it" with two different vendor directories (one locally and one inside the container). This post has benchmarks and details on how to implement it: accesto.com/blog/docker-on-mac-how.... The other post was: medium.com/netresearch/improving-p...
Thanks for checking out the article, Programster! You've got some great feedback here, and I've updated my article in a few places based on a few points you've raised.
1. Docker's Linux VM
Docker Desktop for Mac runs a Linux VM under the hood. If it didn't, then Docker wouldn't be able to run the containers: containers rely on Linux internals like cgroups.
I think you're right that I didn't do a great job clarifying that the VM is specific to Docker Desktop and not Docker Engine. I've changed everywhere that I said "Docker's Linux VM" to "Docker Desktop's Linux VM." Good looking out on this one–thank you!
2. PHP's request model
You're right that PHP can cache byte code between requests if you have PHP's OpCache installed and enabled. I omitted this from my description because OpCache & JIT are not enabled in PHP by default: you have to compile and configure the OpCache extension. Since they are not enabled by default, I consider the post's description of the PHP request model to be accurate. I did consider mentioning OpCache in this section when I wrote the post, but I thought it would have obfuscated the description.
That said, even if you enable OpCache you'll still see performance improvements by moving the vendor/ directory: OpCache checks file timestamps for changes, and it can do the check quicker if the files it's checking are in the container.
3. Optimize Autoloader
Yep, "optimize-autoloader" is set to true by default!
4. Speed of mounted files
I think you're right that I didn't do a great job explaining that this trick with moving Composer's vendor directory really only helps with Docker Desktop on macOS and Windows. I added a second paragraph to the intro clarifying that this trick is macOS/Windows specific.
Also, I'll take a look at Mutagen–thanks for the tip! And thank you for taking the time to write such a detailed response: I think this article is better because of the points you've raised.
Great article. One thing you could do to improve performance would be to install and enable Swoole. It requires very little change to your main codebase, but the gains are unreal, 800% or thereabouts. I dockerized a PHP app with a real performance issue; the developers were using a convoluted Laravel Homestead setup before, and enabling OpCache didn't do much. Once we wrapped it with Swoole the entire app became very responsive, and requests per second were amazingly fast.
I plan to try Laravel Octane at some point!
I had painfully slow response times for a large PHP application as well as Laravel apps.
Setting my working_dir to be cached solved this issue.
E.g. (docker-compose):
working_dir: /var/www
volumes:
  # the "cached" flag relaxes mount consistency on Docker Desktop for Mac
  # (the host path below is illustrative)
  - ./:/var/www:cached
Thanks for the tip, Gavin! I'll have to give this a shot.
@tylerlwsmith I came to the internet to solve this same lag issue in the Docker container I'm building and found your post. I found a solution that works: you need to create a named volume in docker-compose.yml, or you should be able to create one using docker run.
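A minimal sketch of that kind of named-volume setup (the service name and paths here are assumptions):

services:
  app:
    build: .
    volumes:
      # Project code is bind-mounted from the host as usual.
      - ./:/srv/app
      # A named volume shadows vendor/, so dependency files stay inside Docker.
      - vendor:/srv/app/vendor

volumes:
  vendor:

Because the named volume masks the bind-mounted vendor/ path, reads of those files never have to go back out to the host filesystem.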
Credit to this blog post, which uses this approach for node_modules: burnedikt.com/dockerized-node-deve...
Isn't it much easier to leave all locations as is, and (NOT bind!) mount /srv/app/vendor as a volume in your docker-compose file, or define it as a volume in your Dockerfile?
Then you'd not have to mess with ENV variables for the vendor dir or modify your autoload location.
You could get a copy of your vendor dir on your host machine by using docker cp or some sync tool.
Laravel Homestead is a great alternative and can use docker in the background.