Back in late 2018, Docker piqued my interest with its potential to eliminate one of my biggest pain points as a developer: managing development environments.
Although I had known about Docker and Docker Compose for several months, my knowledge of setting up a fully functioning environment was lacking. I knew Docker could help me manage multiple versions of PHP and Node, as well as remove the need to have other environment-specific software and dependencies installed directly on my computer. I knew I could codify the server configuration and have containers running software and code that could be started and stopped with ease, whether I was working on macOS, Windows, or even a remote Linux server. What I was missing was the knowledge of how to do all of this properly.
At the time, I also wanted to start developing applications using Laravel, so I looked for solutions geared specifically toward Laravel and Docker. What I found was Laradock, an open-source project containing a collection of Docker images designed to work specifically with Laravel applications. Eagerly, I copied the repo and ran `docker-compose up -d`, failing to realize that this would build every single image in the project and launch a container for each of them ... and there were dozens of them. More than half an hour must have gone by before I realized my mistake and re-ran the command to launch only the containers I was interested in.
While reading through the documentation, I was introduced for the first time to the concept of splitting the software required to run an application into multiple containers, or services, where you would have one container running MySQL and another running Nginx and another running PHP (actually two for PHP in this case). Before, my plan had been to combine everything into one container but after seeing how Laradock was doing it, I thought maybe breaking things up would give me more options, especially in a production environment.
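In Compose terms, that split is just one service per piece of software. Here's a minimal sketch of the idea (the images, versions, and service names are illustrative, not Laradock's actual configuration):

```yaml
version: "3"

services:
  nginx:
    image: nginx:stable
    ports:
      - "80:80"
    depends_on:
      - php-fpm
  php-fpm:
    # First PHP container: serves requests passed along by Nginx
    image: php:7.3-fpm
  workspace:
    # Second PHP container: an interactive shell for running
    # Composer, Artisan, and NPM commands against the code
    image: php:7.3-cli
    tty: true
  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: secret
```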
To get started setting up my own Laravel project with Docker, I cherry-picked only the services and files I needed to avoid cluttering up my project. I also installed Laravel in the `src` directory, which was mounted as a volume in the Workspace container. This is what it looked like:
```
docker/
    apache/
    mysql/
    nginx/
    php-fpm/
    workspace/
    docker-compose.yml
    env-example
src/
    app/
    bootstrap/
    config/
    ...
```
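The mount itself is a single line in the Compose file. A sketch, assuming the Compose file lives in `docker/` and that the code should appear at `/var/www` inside the container (the container path is an assumption, not an exact excerpt):

```yaml
# docker/docker-compose.yml -- hypothetical excerpt
services:
  workspace:
    volumes:
      # Laravel lives in src/ on the host and shows up at
      # /var/www inside the Workspace container
      - ../src:/var/www
```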
Even though I preferred Nginx at the time, I also included Apache so I could accommodate some projects I was doing for clients.
To start and stop my Docker containers, I would run the following:
```
cd docker

# Start services
docker-compose up -d nginx mysql

# Stop services
docker-compose down
```
I later changed this so that I would run separate bash commands for starting and stopping the servers, because for some reason I thought this was too much to type.
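Those commands were little more than wrappers around Compose. A rough sketch of what they amounted to, with hypothetical function names and project paths:

```bash
# ~/.bashrc -- hypothetical helpers for the old per-project setup
denv-up() {
    (cd ~/projects/my-app/docker && docker-compose up -d nginx mysql)
}

denv-down() {
    (cd ~/projects/my-app/docker && docker-compose down)
}
```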
Throughout my first few months of experimenting with Docker, I used it to rebuild my website, then I migrated some client projects to use Docker locally, and then I used it to develop my issue tracker, Ishuro, which I started in November of 2019. Even though my knowledge of Docker's capabilities was still very limited at the time, I felt it was a major improvement over what I was doing before.
Fixing the Problems I Created
Fast-forward one year, and I had accumulated a list of annoyances with how I was using Docker locally. These are some of the main problems I encountered:
- Switching between projects was painful. Whenever I switched, I'd have to check whether I had containers running, navigate to the correct project directory, shut them down, and then start the containers for the project I wanted to work on. This was especially tiresome when I started developing Laravel packages and was using one environment for developing the package and another for installing and testing it. I'd have to switch projects whether I was installing a package with Composer, compiling assets with NPM, or just wanting to view the project in the browser.
- Changes were not synchronized. Whenever I wanted to make a change to an image or a config file for all my environments, I had to duplicate that change across multiple projects. There was always the potential of one environment differing from another, whether intentionally or not. Managing multiple environments started to feel unnecessary when there wasn't much benefit to doing so and they were all running with basically the same requirements.
- MySQL data was getting corrupted. Something was causing MySQL's data files to become corrupt when switching between projects, which caused the container to crash whenever I tried to start it. I had to create a separate data directory for each project that used MySQL to prevent this problem from occurring (a sketch of the workaround follows this list). I suspected I was doing something wrong, but I couldn't figure out what it was and I didn't want to spend a lot of time trying to solve it.
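For reference, the workaround amounted to giving each project's `mysql` service its own host data directory. A sketch, with hypothetical paths:

```yaml
# docker-compose.yml -- hypothetical excerpt for one project
services:
  mysql:
    image: mysql:5.7
    volumes:
      # A data directory per project, so two projects never share
      # (and corrupt) the same InnoDB files
      - ~/.docker-data/my-app/mysql:/var/lib/mysql
```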
Except for maybe client projects, I really didn't have a good reason to keep my projects in their own isolated environments, so I started thinking about combining them into a single environment. I believed this would not only solve my problems but also simplify my development process overall.
Switching to One Docker Environment
Making the switch to one Docker environment for most of my projects - meaning one set of containers, or services - effectively solved all of these problems. I no longer had to deal with synchronizing changes because there was only one set of files to update, which made updating all my projects to Composer 2 a breeze. I also didn't have to deal with the MySQL problem because there was only one MySQL container. And instead of merely copying over what I needed, I forked the Laradock repo (as its docs suggest) and installed it as a submodule in my new default development repo, which I called LCMP (Linux, Caddy, MySQL, PHP). I switched to Caddy because it makes enabling HTTPS in production really easy.
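Installing the fork as a submodule is a one-liner. A sketch, with an illustrative fork URL:

```bash
# Inside the LCMP repo; the fork URL here is illustrative
git submodule add https://github.com/<your-username>/laradock.git laradock

# Anyone cloning LCMP later pulls the submodule in with:
git clone --recurse-submodules <lcmp-repo-url>
```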
Within LCMP I added a `proj` directory, which is ignored by Git. Here I cloned all of my project repos locally. This entire directory gets mounted as a volume in the Workspace container.
After updating my `Caddyfile` and `hosts` file, each project became accessible from its own domain name. My website's development environment, for example, was made accessible from `http://aa.test`. I shortened my project directory names because I was tired of having long file paths take up more than half my terminal.
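Both changes are small. Here's a sketch of each, assuming Caddy v2 syntax and Laradock's `php-fpm` service name (the container paths are illustrative, not exact excerpts from my files):

```
# Caddyfile -- one site block per project
http://aa.test {
    root * /var/www/aa/public
    php_fastcgi php-fpm:9000
    file_server
}
```

```
# /etc/hosts -- point each test domain at the local machine
127.0.0.1  aa.test
```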
My new directory structure looks something like this:
```
laradock/
proj/
    aa/
    ish/
    ...
```
Using Docker Now
After using one Docker environment for developing my projects for two weeks now, I can say that overall it has been a great improvement to my developer experience, and I have no desire or need to go back to one environment per project, at least not for development.
I'm okay with each project having its own production environment when the time comes. That's where the maintenance overhead of decoupled environments offers practical benefits instead of being an unnecessary burden. So far I only have one such project, and that's Ishuro.
In some cases, however, I've found it's better not to have to manage a production server at all, as is the case with this website, which is built with Nuxt and hosted by Netlify. Even then I can benefit from using Docker. I develop the site locally in my Workspace container, where I have Node and NPM installed, test it in the browser, and then, when I'm ready, commit my changes and they get deployed automatically. I don't think I could make the process simpler.