Trellis/Ansible with dopy to provision a DO droplet?

Has anyone already created a task for spinning up/creating a droplet using dopy with Ansible, or do you see a need for this?

Personally I love spinning up droplets using the DO API, even for small tests, because I want to be sure the system is clean and configured exactly as I specified. I'm therefore a heavy user of the DO API, but I lack the Ansible know-how to get it done. Anyone got a working example?
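For what it's worth, here is a minimal sketch of what such a task might look like, using Ansible's `digital_ocean` module (which depends on dopy). The droplet name, slugs, SSH key ID, and the `do_api_token` variable are placeholders, and the exact parameter names vary between Ansible versions (older releases use `client_id`/`api_key` instead of `api_token`), so check the docs for your version:

```yaml
# Sketch: create a droplet from a local play; all names and IDs are placeholders.
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Create a droplet for the new site
      digital_ocean:
        state: present
        command: droplet
        name: example-droplet
        api_token: "{{ do_api_token }}"   # keep the token in Vault, not in the repo
        size_id: 512mb
        region_id: nyc3
        image_id: ubuntu-14-04-x64
        ssh_key_ids: 123456
        unique_name: yes                  # makes re-runs idempotent by droplet name
        wait_timeout: 500
      register: my_droplet

    - name: Show the new droplet's IP so it can go into the Trellis hosts file
      debug:
        msg: "Droplet IP is {{ my_droplet.droplet.ip_address }}"
```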

  1. I personally keep all my repos separate and don’t add them together like the roots example. So I will normally clone Trellis from the roots repository. Next I remove the roots remote origin and add my own private repo as the new remote origin, so I get to keep the full commit history instead of deleting the .git

Then I normally create two branches in my new Trellis repo: an upstream branch, which has the roots/trellis repo set as its upstream source and which I keep current with roots/trellis, and a dev-unstable branch where I do all my work. When I’m happy with it I merge dev-unstable into master, and that is what I deploy from. You can keep adding unstable test code to your dev branch while your upstream branch stays in sync with roots, then merge whatever you want from either branch into your master deploy branch, roughly like the sketch below.
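A rough sketch of that branch layout in git commands; the repo URLs and branch names are placeholders matching the description above, not something Trellis prescribes:

```bash
# Clone Trellis, then point origin at your own private repo while keeping history.
git clone https://github.com/roots/trellis.git example.com-trellis
cd example.com-trellis
git remote rename origin upstream        # roots/trellis stays reachable as "upstream"
git remote add origin git@github.com:yourname/example.com-trellis.git
git push -u origin master

# Branch that tracks roots/trellis, plus a working branch for experiments.
git checkout -b upstream upstream/master
git checkout -b dev-unstable master

# Later: refresh the upstream branch and merge what you want into master.
git checkout upstream && git pull upstream master
git checkout master && git merge dev-unstable   # or: git merge upstream
```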

  2. Bedrock should be your website, and everything related to it should live there 100%. So normally your theme goes right in Bedrock and becomes part of that repo, and your plugins should all get brought in as dependencies through composer.json

I normally give the Bedrock repo the same name as my website, so example.com, and I follow the same flow as above with the three branches on my Bedrock site repo. Private plugins should be brought in through composer.json in Bedrock as well, either with something like Toran Proxy or by saving them in a private repo and tagging the versions so you can keep that part straight as well.
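For the private-repo route, a sketch of what that looks like with Composer; the repo URL and package name are made up for the example, and the plugin's own composer.json needs a `wordpress-plugin` type (plus tagged releases) so composer/installers places it correctly inside Bedrock:

```bash
# Register the private VCS repo in Bedrock's composer.json, then require a tagged version.
composer config repositories.private-plugin vcs git@github.com:yourname/private-plugin.git
composer require yourname/private-plugin:^1.0
```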

So Bedrock for me has my whole theme committed right in version control and plugins in Composer. Then for uploads I normally just use the Human Made Amazon S3 plugin from their GitHub, straight from development, so all the uploads live in an S3 bucket. If I’m not using S3 or Google Cloud Storage I’ll just keep them in sync with rsync, but using S3 is a better option, for me at least.
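If you go the rsync route, something along these lines covers it; the host, user, and paths are placeholders assuming a standard Trellis/Bedrock layout where uploads live in web/app/uploads:

```bash
# Pull the uploads directory from the server down to a local copy
# (swap the two arguments to push instead).
rsync -avz web@example.com:/srv/www/example.com/current/web/app/uploads/ ./web/app/uploads/
```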

The database is the same story: I normally just use Google Cloud SQL right from development, so my database always stays safe and backed up there.
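In Bedrock that just means pointing the database variables in .env at the Cloud SQL instance instead of localhost; the values below are placeholders (you might also connect through the Cloud SQL proxy on 127.0.0.1):

```
# Sketch of the relevant Bedrock .env settings; credentials and host are placeholders.
DB_NAME=example_db
DB_USER=example_user
DB_PASSWORD=secret
DB_HOST=203.0.113.10
```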

Pretty much, if you do it like that or something similar, you can never lose anything or forget to dump your database before destroying the VM holding the only copy. Did that once and that ended local databases for me forever haha