Developing with Symfony 2 and Docker

I had a couple of aims for my most recent quick project, Dashli:

  • be able to avoid polluting my machine with PHP, MySQL, and those other services which quickly get littered around,
  • be able to really quickly deploy to a service, without having to ssh in and set everything up

Docker handles both of these nicely.

There’s a simple Dockerfile which lets you write down the steps you’d take to build a box and set it up for your application. That means that during development, Docker knows how to build a virtual machine which’ll keep the entire server infrastructure hidden away from my “local” OS. No more worrying about which PHP version I need for different applications, and no need to keep a MySQL server running for no good reason.
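
To make that concrete, a Dockerfile for a Symfony 2 app looks roughly like this. It’s a minimal sketch, not Dashli’s actual file – the php:5.6-apache base image and the paths are assumptions (the real setup serves through nginx):

# A minimal sketch, not Dashli’s actual Dockerfile – base image and paths are assumptions.
FROM php:5.6-apache

# A PHP extension Symfony apps commonly need.
RUN docker-php-ext-install pdo_mysql

# Copy the project into the image and install its dependencies at build time.
COPY . /var/www/html
WORKDIR /var/www/html
RUN php composer.phar install --no-interaction

EXPOSE 80

Run docker build . once and you get an image with the whole server setup baked in – that image is what gets run locally and what gets shipped to the cloud.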

Together with tools like Docker Cloud (free with one private instance) and Digital Ocean (not free, but you can get $10 – enough for two months of a small server – when you sign up with this link), it’s super easy to have my project uploaded to a cloud instance and running without me even having to know how to SSH into the box.

I had this all set up and running brilliantly. The problem was when I came back to my machine to continue development.

Tiny change, rebuild. Tiny change, rebuild.

The most frustrating part of this new flow for me was finding a simple typo and then having to redo my docker build . step, which isn’t quick since it has to do a full composer install again.

The fix here was mounted volumes – let the virtual machine mount my project directory as nginx’s root directory. This is simple, actually. When running your docker, do this instead:

docker run --publish 80:80 --volume /local/project/path:/virtual/machine/path/to/project your-image-name

Now, when you update a file in your local project, it’ll be instantly reflected inside the docker. Refresh your browser and you’ll notice the updates.

Composer files getting thrown away

You’re probably running composer.phar install inside your Dockerfile. That will still happen. However, when Docker mounts your local directory over the VM’s /virtual/machine/path/to/project, it effectively hides everything that was in there. That includes your vendor/ folder, which will now be empty.

It’s obvious why – you’ve never run composer locally; you don’t even have PHP installed locally, so how could you! You need composer to run within the Docker container, ideally without having any effect on your local machine.

To do this, we can move our vendors outside of the project. I know, right. Weird. Composer has support for this though. In my composer.json you’ll see this:

"config": {
"bin-dir": "bin",
"vendor-dir": "/tmp/dashli/vendor"
}

This tells composer to install the vendors into the /tmp/dashli/vendor directory (I chose /tmp/ as everyone can write to it – this might not be the smartest place to put it). I also had to edit my app/autoload.php so Symfony knows where the autoloader is.

$loader = require '/tmp/dashli/vendor/autoload.php';

Now your dependencies get installed on the virtual machine, out of the way of your local filesystem.
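
For completeness, the composer step in the Dockerfile itself barely changes – something like this sketch, where the paths are illustrative and only the vendor-dir behaviour matters:

# Sketch: install dependencies at image build time. Because composer.json sets
# vendor-dir to /tmp/dashli/vendor, the vendors land outside the project path
# and survive the volume mount from earlier.
COPY . /virtual/machine/path/to/project
WORKDIR /virtual/machine/path/to/project
RUN php composer.phar install --no-interaction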

Getting around root-created files

The huge downside of this is that any file written by the www-data user inside your container is actually written locally by root. You’ll now find your local project’s var/cache/, var/session/, etc. full of files owned by root. You’ll have to sudo rm them to get rid of them, which is not part of a healthy development flow.

Symfony comes fully equipped to handle this problem though: in your app/AppKernel.php, override the getCacheDir() and getLogDir() methods so they point at your /tmp/ directory, just like we did with the composer dependencies above.
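
Something along these lines does it – a sketch that reuses the /tmp/dashli/ convention from above; the bundle list and config loading stay as Symfony generated them:

<?php
// app/AppKernel.php – a sketch. The important part is getCacheDir() and
// getLogDir(); the /tmp/dashli paths just match the composer config above.

use Symfony\Component\Config\Loader\LoaderInterface;
use Symfony\Component\HttpKernel\Kernel;

class AppKernel extends Kernel
{
    public function registerBundles()
    {
        // Your existing bundle list stays exactly as it was.
        return array(
            new Symfony\Bundle\FrameworkBundle\FrameworkBundle(),
        );
    }

    public function registerContainerConfiguration(LoaderInterface $loader)
    {
        $loader->load($this->getRootDir().'/config/config_'.$this->getEnvironment().'.yml');
    }

    public function getCacheDir()
    {
        // Written by www-data inside the container, so keep it out of the mounted project.
        return '/tmp/dashli/cache/'.$this->getEnvironment();
    }

    public function getLogDir()
    {
        return '/tmp/dashli/logs';
    }
}

If you also want sessions out of the project directory, that one isn’t a kernel method – it’s the framework.session.save_path setting in config.yml.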

You can read more about this in the Symfony docs.

Hopefully that leads you to an easy development experience with Symfony and Docker!
