Composer as part of deploy

I’ve run into an issue with a private (licensed) WP plugin and deploys. Doing a composer install or update locally works just fine, but on my staging server, composer installs an add-on to the plugin, not the plugin itself. (Instead of installing WP Migrate DB Pro, it installs WP Migrate DB Pro Media Files, which isn’t even in composer.json.)

That’s not why I’m posting though; I’m in touch with the WPMDB guys for support. But the whole issue raises the question: why run Composer on the server at all? Why not just have Capistrano copy the compiled files (and WP itself, and plugins) to the server, instead of having it run Composer?

In noodling around with this problem, I’ve come across multiple resources that say not to run Composer on the server. Just wondering if there’s a philosophical reason for running it remotely, since it adds a dependency on the server and some complexity to the deploy itself.

If this has been addressed already, please point the way; couldn’t find anything when I searched.


No, I don’t think it matters. There are a few reasons why I’ve set it up that way:

  1. It’s just how a lot of services/deployment libraries do it (things like Heroku etc).
  2. It’s how Capistrano does it by default.

Although, thinking about it off the top of my head, it’s probably faster in most cases to run it on the server: the server downloads packages over a high-bandwidth connection, versus you uploading a (potentially) huge number of files over a likely much slower upstream connection.

But there is a reason why this was historically done in a tool like Capistrano. It was created for deploying Rails apps, and Rubygems can include “native extensions” which need to be compiled. That means you can’t just build them locally; they have to be compiled on the target server.

Normal PHP packages don’t have that capability. PHP does have extensions (think PEAR, or packages like php-memcached), which people usually install through apt. I’ve seen a proposal to install these through Composer, but right now you can’t.
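For what it’s worth, Composer can at least *declare* a dependency on a PHP extension via a platform requirement, even though it can’t install the extension itself; `composer install` will then fail early if the extension is missing from the target PHP. A minimal composer.json fragment (the extension name is just an example):

```json
{
    "require": {
        "ext-memcached": "*"
    }
}
```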

So as far as I know, there’s no reason why you can’t run Composer locally and then upload the files.
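As a rough sketch of that “build locally, upload the result” flow (the host and path here are placeholders, not anything from this thread):

```shell
# Build locally: production dependencies only, with an optimized autoloader
composer install --no-dev --prefer-dist --optimize-autoloader

# Upload the built tree (including vendor/) to the server;
# --delete keeps the remote copy in sync with the local build
rsync -az --delete \
  --exclude '.git' \
  ./ deploy@example.com:/srv/www/mysite/current/
```

The server never needs Composer installed at all with this approach, which is the appeal from a 12-factor perspective.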

That makes sense, if it’s common practice (though the Rails thing seems vestigial at this point). I just started thinking about the 12-factor app thing (one build, many deploys), and the server-side composer dependency seems to go against that grain. (There’s no guarantee the server will have access to composer, etc.)

The more I read about things like this, the more a solution like Docker makes sense: all the dependencies, including OS-level dependencies, wrapped up in a neat little package.
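A minimal sketch of that idea (base image, package list, and paths are assumptions, not a tested setup): OS-level packages and Composer dependencies both get baked into the image at build time, so the target server only needs Docker.

```dockerfile
FROM php:7.0-apache

# OS-level dependencies live in the image, not on the host
RUN apt-get update && apt-get install -y git unzip \
    && rm -rf /var/lib/apt/lists/*

# Install Composer inside the image, then build the app's
# dependencies at image build time rather than at deploy time
RUN curl -sS https://getcomposer.org/installer | php -- \
    --install-dir=/usr/local/bin --filename=composer

COPY composer.json composer.lock /var/www/html/
WORKDIR /var/www/html
RUN composer install --no-dev --optimize-autoloader

COPY . /var/www/html/
```

The resulting image is the “one build” that then gets deployed many times, which is exactly the 12-factor shape mentioned above.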

Except it’ll take me a while to figure out exactly how to do that while maintaining the directory structure from bedrock-ansible, haha.

(But this is how we learn and the reason I’ve used these tools on every project the last few years)

Bigsweater - if you were able to deploy Bedrock and Sage to a Docker instance (hopefully with Docker Compose), I’d definitely like to look around if you’ve posted that setup anywhere.

Recently I had a client with a small budget want me to put a site together and I wanted Docker+Bedrock+Sage.

I spent so much time trying to figure it out that I ended up hacking a pre-made theme down just to deliver the project on time. I was disgusted with myself, but I had a tight timeline to hit.