After reading this article, I started thinking more about my WordPress deployment process.
Like everyone, I started out with FTP. I got the job done without a whole lot of fuss. When I finished developing a site, I simply uploaded the entire folder and I was done.
When my development process matured a bit more, I needed to maintain sites and make more frequent changes. This is where I got started with Git. FTP became a slight problem because I was never sure which files I needed to upload. So I was forced to upload the entire folder just to ensure that all my changes made it to the server. As sites got bigger, this became a bigger pain in the butt.
Since Git had worked so well for me in the past, it was only a matter of time before I got started with the Git workflow. I'll admit, for the sites I set this up on, it worked great. The reason it worked fine is that all the sites I managed and my Git repositories were on the same server. I won't go through the issues of installing Git on a shared host. When adding a new site to my workflow, I would have to do the following:
- Add the new Git repository to the server
- Modify the post-receive script
Again, this worked great. Everything was integrated and I could deploy from the command line. I was really proud of myself. Then came a client with their own server. Installing Git there was simply not an option. Even if it had been, I would have had to jump through the hoops of setting up SSH keys from their server to mine. It just didn't work. I was back to the stone age of good old FTP.
Fabric essentially gives you an easy way to run commands on a remote host. I like it because it doesn't require much on the server side. I'm using its rsync integration, and rsync is available practically everywhere. Rsync handles my issue of only pushing files that have changed; something I had come to like about Git.
So enough talk, here's my fabfile.py script:
```python
from __future__ import with_statement
from fabric.api import *
from fabric.contrib.project import *

# Deployment settings
dir = '~/www/wp-content/'
exclude_sync = ('uploads', '.DS_Store', '.git', '*.pyc', '*.py', 'build',
                '.htaccess', 'wp-config.php', '.sass-cache', '*.log', '*.tmp',
                '*.bak', '*.sublime-*', 'cache', 'infinitewp', 'upgrade')
env.hosts = ['site.com']
env.user = 'username'
env.password = 'password'
local_dir = '/xampp/wordpress/site/wp-content/'

def deploy():
    local('compass compile -e production --force')
    rsync_project(remote_dir=dir, local_dir=local_dir,
                  exclude=exclude_sync, delete=False)
```
I run this by typing `fab deploy` on the command line. That's it. A quick summary of my script:
- You set up some configurations
- Set up your command
As you might guess, the `local` command runs commands on your local machine. Here, I'm compiling my Sass files for deployment. Other build steps, such as minification and cleanup, could go here as well.
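As a sketch, here's what a few extra local build steps might look like (the `uglifyjs` call and the file paths are just examples, not part of my actual setup; use whatever build tools your project already has):

```python
from fabric.api import local

def build():
    # Compile Sass to compressed CSS, same as in the deploy task
    local('compass compile -e production --force')
    # Hypothetical extras: minify JavaScript and clean up old artifacts
    local('uglifyjs js/app.js -o js/app.min.js')
    local('rm -rf build/tmp')
```

Because `local` just shells out, anything you can type in a terminal can become part of the deploy.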
The `rsync_project` command is a wrapper around rsync that pushes your local directory to the server. If you don't need rsync, you can call the `run` command to execute commands on the server.
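If you'd rather skip rsync entirely, a bare-bones task built on `put` and `run` might look roughly like this (the host, user, and paths are placeholders, not from my setup):

```python
from fabric.api import env, put, run

env.hosts = ['site.com']
env.user = 'username'

def deploy_file():
    # Upload a single changed file to the remote host...
    put('style.css', '~/www/wp-content/themes/mytheme/style.css')
    # ...then run an arbitrary command there, e.g. clearing a cache dir
    run('rm -rf ~/www/wp-content/cache/*')
```

You lose the "only sync what changed" behavior of rsync, but for one-off fixes it's hard to beat.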
Fabric is very flexible that way. If you don't like or need rsync, you can use Git instead: use `local` to run `git commit` and `git push`, then use `run` to do a `git pull` on the server. That way it's very similar to the Git workflow. The reason I prefer Fabric is that everything is self-contained in the fabfile; I'm not juggling post-receive scripts on multiple servers.
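Sketched out, that Git-based variant could look something like this (the remote path, branch name, and commit message are assumptions for illustration):

```python
from fabric.api import cd, env, local, run

env.hosts = ['site.com']
env.user = 'username'

def deploy_git():
    # Commit and push from the local working copy...
    local('git add -A && git commit -m "Deploy" && git push origin master')
    # ...then pull on the server; no post-receive hook needed
    with cd('~/www/wp-content'):
        run('git pull origin master')
```

This assumes the server already has Git installed and the repository cloned, so it wouldn't have helped with my client's locked-down server, but it keeps the whole workflow in one file.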
Hope the simplicity helps you.