Deploying via AWS OpsWorks and Octopus

Moha Alsouli
Oct 4, 2019


I found this post in my old notes, one I had been planning to publish three years ago! It's still relevant, though, so here it is, slightly edited, for anyone looking for tips on this topic:

At Tigerspike, We Solve Problems in Remarkable Ways.

One of our goals, as a global digital products company, is to deliver the best user experience, and that includes delivering the best experience to our engineers.

So, for one of our builds a few years ago, the client wanted to host and manage a set of Windows servers, running a .NET API, on AWS OpsWorks.

AWS OpsWorks Stacks is a great service that takes care of server configuration management and automates tasks that would traditionally need an entire support team, such as auto-healing, provisioning new resources and application deployments.

Of course, we sketched an HLD, proposed it to the client, got approvals and in no time provisioned the bespoke hosting environment. The usual stuff.

What was tricky though was the requirement to:
- use Jenkins for CI, and
- use Octopus for CD.
I won’t get into the business requirement behind this, but we all know Jenkins is a pretty solid (and popular) automation server, and Octopus Deploy makes lifecycles and release management a breeze with its neat UI, so why not?!

Anyway, how did we link it all together?

Well, it’s easy! The repo triggers a Jenkins job, Jenkins builds and pushes to Octopus, then Octopus pushes to AWS according to the parameters.
Something like this: repo → Jenkins → Octopus → AWS OpsWorks.

Getting the code repository to trigger a Jenkins job isn’t magic, nor is getting Jenkins to build and test on an EC2 Spot Instance and then trigger an Octopus project. There are plugins, docs and whitepapers for that.

What’s custom though is automating Octopus to deploy via AWS OpsWorks.
It’s not tricky; it just took a couple of hours to create a custom Octopus step template with a PowerShell script utilising the AWS CLI to do the work. And before I show you the code snippet, let me highlight this workflow again:
- Developer pushes code to code repository (e.g. Bitbucket/Gitlab);
- Code repo triggers a Jenkins job;
- Jenkins tests the .NET project, then builds, packages and pushes it to Octopus Deploy;
- Octopus unpacks the package into a temporary directory, then zips the raw files into a .zip archive (using your custom scripts, the default step templates or templates from the community library);
- Octopus then uploads the zipped package to an S3 bucket (again using your custom scripts or templates from the community library);
- Finally, Octopus triggers the OpsWorks deployment using the AWS CLI and polls until a result is returned (using the custom script below).
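To give a feel for those last three steps, here is a minimal PowerShell sketch, not the production script, that zips a package, uploads it to S3 and triggers an OpsWorks deployment via the AWS CLI, then polls until OpsWorks reports a terminal status. The variable and parameter names (StackId, AppId, the bucket name and paths) are illustrative assumptions; in a real step template you would pull them from Octopus project variables, and the AWS CLI would run under the Octopus IAM user's credentials.

```powershell
# Illustrative Octopus step template parameters -- adjust names to suit.
$stackId   = $OctopusParameters["StackId"]
$appId     = $OctopusParameters["AppId"]
$sourceDir = $OctopusParameters["PackageDirectory"]   # unpacked raw files
$s3Bucket  = "my-deploy-bucket"                       # placeholder bucket

# 1. Zip the raw files (normally its own step template).
$zipPath = Join-Path $env:TEMP "release.zip"
Compress-Archive -Path (Join-Path $sourceDir "*") -DestinationPath $zipPath -Force

# 2. Upload the archive to S3 (again, normally its own step).
aws s3 cp $zipPath "s3://$s3Bucket/release.zip"

# 3. Trigger the OpsWorks deployment; OpsWorks runs the app's Deploy recipes.
$deploymentId = aws opsworks create-deployment `
    --stack-id $stackId `
    --app-id $appId `
    --command Name=deploy `
    --query "DeploymentId" --output text
Write-Host "Started OpsWorks deployment $deploymentId"

# 4. Poll until OpsWorks returns a terminal status (successful/failed).
do {
    Start-Sleep -Seconds 15
    $status = aws opsworks describe-deployments `
        --deployment-ids $deploymentId `
        --query "Deployments[0].Status" --output text
    Write-Host "Deployment status: $status"
} while ($status -eq "running")

if ($status -ne "successful") {
    throw "OpsWorks deployment $deploymentId finished with status '$status'"
}
```

Failing the step when the status isn't "successful" is what lets Octopus mark the release as failed instead of silently moving on.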

Of course, it’d be better to create a custom Step Template in your Octopus server with this script and the required parameters, so you can reuse it across multiple stacks and projects. That's especially handy if you have multiple environments, e.g. Dev, UAT and Prod, and want to use the project’s variables to set the parameters dynamically at deploy time.

And as you may have guessed, you will need an IAM user for Octopus, with permissions to write to the S3 bucket and to create and check deployments in the OpsWorks stacks you’re deploying to.
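As an illustration, a minimal IAM policy for that user could look like the following. The bucket name, account ID and resource ARNs are placeholders; scope them down to your actual bucket and stacks.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "UploadPackages",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::my-deploy-bucket/*"
    },
    {
      "Sid": "TriggerAndCheckDeployments",
      "Effect": "Allow",
      "Action": [
        "opsworks:CreateDeployment",
        "opsworks:DescribeDeployments"
      ],
      "Resource": "arn:aws:opsworks:*:123456789012:stack/*"
    }
  ]
}
```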

Hope that helped you!
