Is this an OK Git workflow? If not, any chance of optimizing it for my needs?

Since I don’t have much Git experience, I’m looking for some viewpoints on my upcoming Git workflow. My projects are mostly rather small (around 300 visitors a day, some a little more) and I’m the only person working on them. However, I want a flow that lets others take part in development when I need them to, and it’s also important for me to be able to continue working from home, off work, when I feel like it.

  1. Local development. I usually set up WordPress and its database locally. The question is whether I should connect to the database on a staging or maybe even the live server instead, to save the time of moving the DB manually later.

  2. The theme is pushed to a bare repo on my VPS, and I let Jenkins push it to the live server. Other developers and I can also pull the latest code from the VPS to our machines and continue working. Problem: I don’t know whether Jenkins can be set to trigger/push only on a specific branch (the master working branch).
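
For what it’s worth, the branch-filtering half of this can be solved on the Git side regardless of Jenkins: a post-receive hook in the bare repo receives every pushed ref on stdin and can act only on master. A minimal sketch, where the deploy path and the checkout-based deploy step are hypothetical:

```shell
#!/bin/sh
# hooks/post-receive in the bare repo on the VPS.
# Git feeds one "oldrev newrev refname" line per pushed ref on stdin.

DEPLOY_DIR="/var/www/example.com"   # hypothetical deploy target

while read oldrev newrev refname; do
    if [ "$refname" = "refs/heads/master" ]; then
        # Only a push to master reaches this point.
        git --work-tree="$DEPLOY_DIR" checkout -f master
        echo "Deployed master to $DEPLOY_DIR"
    else
        echo "Ignoring push to $refname"
    fi
done
```

The same filter works if the hook’s only job is to notify a CI server: trigger the build only when the pushed ref is refs/heads/master.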

Do you think this is an OK workflow for small businesses like mine? Or is it overkill?

I’ve considered deploying via FTP instead, and also via GitHub, but again there’s the problem of Jenkins firing on all branches.

I use DB Migrate Pro for syncing the database between development, staging, and production and I use dploy.io to deploy my code.

I keep only my theme folder under VC. My remotes live in Bitbucket, which I chose because of the unlimited free private repos.

Bitbucket holds my master repo.

Locally I use gitflow, so I’m generally doing my work in a develop or feature branch, which is then periodically merged into master.

dploy.io is connected to my Bitbucket repo, so when I push to master from my local repo, the code gets deployed to the live site.

When I’m working with someone else, usually my friend in New York, our workflow looks something like this:

  1. Communicate through Slack, so we know what we’re working on.
  2. If necessary, pull the DB from the remote site with DB Migrate Pro.
  3. Pull the latest code down from master and merge it into develop.
  4. Do cool stuff.
  5. Merge the develop branch back into master and push to the remote.
  6. dploy deploys the code, either automatically, by passing a flag in the commit message, or by logging into the dploy interface.
  7. If necessary, push the local DB to the remote with DB Migrate Pro.
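
Steps 3 and 5 above map onto plain Git roughly like this (remote and branch names are the usual defaults; adjust to taste):

```shell
# 3. Pull the latest master and merge it into develop.
git checkout master
git pull origin master
git checkout develop
git merge master

# 4. Do cool stuff, committing as usual.

# 5. Merge develop back into master and push to the remote;
#    the push to master is what triggers the deployment.
git checkout master
git merge develop
git push origin master
```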

No idea if that helps at all, but this workflow has been pretty great for me.


Thanks a bunch for your answer, @smutek. I really needed a second opinion and a working example from someone deeper into the business than I am. So yeah, this helped.

Though I have some practical questions regarding the deploy process:
Dploy.io seems like a very nice tool, but it’s relatively pricey (I’m not cheap, so don’t misunderstand me – I’ll buy it if I can’t pull off my plan) for something that could potentially be done with post-receive hooks in a bare repo. Imagine using something like https://github.com/noelboss/deepl.io as a hub on a VPS, letting that server handle all repos/branches and thus have rules for each site/customer, then deploying to each customer’s unique website on their own domain.
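
The server side of that DIY route is fairly small: one bare repo per site, an executable post-receive hook, and a remote on each developer machine. A rough sketch, where the paths, hostname and deploy.sh script are all hypothetical:

```shell
# On the VPS: one bare repo per customer/site.
git init --bare /srv/git/customer-site.git

# Install your deploy script as the repo's post-receive hook;
# it must be executable, and it runs on every push.
install -m 755 deploy.sh /srv/git/customer-site.git/hooks/post-receive

# On a developer machine: add the VPS as a remote and push.
git remote add vps user@vps.example.com:/srv/git/customer-site.git
git push vps master
```

Per-customer rules then live in each repo’s hook script, which is roughly the part that deepl.io would centralize for you.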

What do you think of this idea? Maybe this is a total detour? Or do you use this method with your one free FTP connection on Dploy?

Let’s say you have 100 customers, each with a website. 50 require updates daily and the other 50 a couple of updates yearly. Would that qualify for a bigger Dploy plan? That’s expensive for the customers with fewer updates, but for the sake of efficiency, every customer should have their own repo on Dploy. Juggling 100 customers with only 50 Dploy repos seems quite exhausting.

Sorry for inventing a problem that probably doesn’t exist, but hypothetically :slight_smile: where should the limit be?

I guess I should take into account that setting up post-receive hooks manually is a pain in the ass, and when many customers reside on shared hosting there will always be problems with mod_rewrite, PHP exec and whatnot.

Conclusion: good quality costs?

Hey raffen, no problem at all. :smile:

Totally. If you have the ability to get this running easily then it would definitely make sense to go the DIY route, in my view. I tried to set up git deployment about a year ago and at the time it was over my head.

I have the 10-repo dploy plan at $15 per month. I’m a solo freelancer and I never have more than 5 active projects at any one time, so the 10 repos are plenty for active projects and still leave some for personal projects.

In fact, right now I am only using 2 of my 10 repos. :smile:

When a job is archived I remove it from the dploy account, so that I can rotate new stuff in.

I don’t have any sites that require updates daily, but for those odds and ends that come in, maintenance tasks, small tweaks, etc. I’ll just use FTP. If it’s a larger task I’ll set it up in dploy again. General rule of thumb is that if it’s something I can knock out in an afternoon or so, I’ll use FTP. If it’s something that’s going to be ongoing I’ll just set it back up in one of my empty repos.

I keep just about everything in VC anyway, so it’s just a matter of plugging the info back into dploy, which takes about a minute, and reconnecting to the remote. (I shot a quick demo a while back, if you’re interested)

Yep, exactly. I don’t always get to do nice Sage projects, plenty of times I’m stuck working with some bloated theme hosted on an overloaded, underpowered GoDaddy server, or whatever.

dploy is fast and easy to get going, it works everywhere, and with everything.

As far as the value proposition goes, YMMV, but it definitely provides value for me and is worth the cost. Even in times where I am under-utilizing my plan, which is often, it’s good to have the peace of mind in knowing that I have this nice tool at my disposal.

Hope this helps!

@smutek, this helps a lot :slight_smile: You helped me get Git further under my skin. I signed up at Dploy to try it, and I must say it works really well. I might do a side project with the VPS “git hub” idea though; then I can learn some Bash and Linux at the same time.

I’m going to digest this for a while. I’ve been researching the best workflow for me a lot over the last week, so I’m kind of tired of it right now. Hope it’s OK if I ask a question or two at a later time.

Happy gitting :sunny:


Hey, I’m the developer of Deepl.io, and I’d be happy to hear about your experience when you try it, and to assist with any problems. As of now, it’s best to install Deepl.io on a subdomain, since a subdirectory installation doesn’t seem to work. I have Deepl.io working for multiple projects, repos and branches… It’s fast and reliable. If you know how to write shell scripts, I think it’s really easy…

PS: I’m planning on using Deepl.io in conjunction with DB Migrate Pro and probably http://revisr.io – I still don’t know how to safely track DB changes made by WP plugins during updates (for a network installation).


Hi @noelboss. Thank you for the offer to help. I’m aware of the subdirectory problem, as I’m the one who posted the issue on Deepl.io’s GitHub. Unfortunately I haven’t had time to try it on a subdomain yet, but hopefully I will in the near future. I’ll keep you posted.

This is the first time I’ve heard of revisr.io. Is it good? Are there any clear benefits over local Git plus a good deploy provider?

Hey Raffen,
Revisr gives you a graphical Git interface on your WP install – with some DB options as well. In my case it wasn’t useful, since the WP plugin structure for networks is so limited. Might use it on a standalone install now… let’s see…
Subdirectory install: try including the subdirectory in your secret key, it might work – like:
"secret": "subdir/secret-key",