CI/CD: CodePipeline vs Jenkins Pipeline vs Bitbucket Pipelines

Moha Alsouli
Apr 27, 2021
What tool do you use to orchestrate your CI/CD?

While we were helping a Product Owner set up for success recently, their Corporate IT asked us why we chose CodePipeline over Jenkins Pipeline as the CI/CD orchestration tool, given that all of their other, slightly older, business units use Jenkins Pipelines. Well, here is why…

Why do I prefer CodePipeline?

  • CodePipeline is an AWS managed service, meaning it requires no ongoing management or maintenance overhead once it has been set up. A Jenkins server, on the other hand, requires ongoing management of Jenkins itself, its plugins, its integrations and the hosting OS (e.g. Linux or Windows).
  • CodePipeline is highly available and reliable, backed by a 99.9% SLA. A Jenkins server’s availability depends on its architecture and dependencies, so it varies; please do let me know if you have one set up with a 99.9% SLA.
  • CodePipeline costs only $1.00 per active pipeline per month, where an active pipeline is any pipeline that has run at least once in the billing month. For example, during active delivery sprints, if a project requires 2 Infrastructure pipelines (Develop for Dev and SIT, plus Master for UAT and Prod) and 2 Application pipelines (same split), that is only $4 a month regardless of use and frequency. Jenkins, on the other hand, is most likely hosted on EC2 instances (or containers), so its cost depends on the number of instances, their size, their uptime in hours and the number of projects sharing Jenkins. That is unlikely to come down to $4 a month per project unless the instance is small and/or shared with many other projects, which raises the risk described in the next point.
  • In CodePipeline, each pipeline runs separately, meaning any number of pipelines can build and deploy simultaneously and continuously without blocking each other. Within each pipeline, a deployment can be happening while a new build is being prepared. In Jenkins, your configuration determines the number of executors, i.e. the number of pipelines that can run at the same time, and, by default, you cannot start a new run of a particular pipeline before its previous run completes.
  • CodePipeline can run builds and tests through CodeBuild, which is also a reliable, cost-effective AWS managed service that requires no ongoing management and comes with many ready-made build images, though you can also create your own (see the quick CLI checks after this list). Jenkins, on the other hand, requires setting up and integrating build servers (or a local build environment).
  • CodePipeline integrates seamlessly with AWS services, making deployments with CloudFormation and CodeDeploy, and to S3, ECS and Lambda, very secure and easy. Jenkins, on the other hand, requires installing plugins and configuring the right AWS permissions before it can deploy to AWS.
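
For instance, here is a quick sketch of the kind of checks you can run with nothing but the AWS CLI (the pipeline name below is hypothetical): CodeBuild will list its curated, ready-made build images for you, and each pipeline keeps its own execution history that you can inspect without logging in to any server.

# List the curated build images CodeBuild offers out of the box
aws codebuild list-curated-environment-images

# Inspect recent executions of a (hypothetical) pipeline; no build server to maintain
aws codepipeline list-pipeline-executions --pipeline-name my-app-develop --max-items 5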

What if you need to trigger your pipeline from dynamic branches?

CodePipeline, as you might know, can only be configured with static branches, e.g. you can set up a Develop pipeline to pick up from the develop branch of your source repository, but not from a dynamic branch pattern such as feature/*. If you need that flexibility, check out Henrique Bueno’s cool solution!
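
To make the limitation concrete, the source action of a pipeline pins one branch in its configuration. A quick look with the CLI (the pipeline name is hypothetical, and this assumes a CodeCommit source action in the first stage) shows a fixed RepositoryName and BranchName rather than a pattern:

# Show the source action configuration of a (hypothetical) pipeline;
# for a CodeCommit source this holds a fixed BranchName, not a wildcard
aws codepipeline get-pipeline --name my-app-develop \
  --query 'pipeline.stages[0].actions[0].configuration'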

But hey, if you’re on Bitbucket, you can also do that using Bitbucket Pipelines. Just note that if you want to deploy and/or push code to AWS from Bitbucket Pipelines, you will need to give it programmatic access to AWS through access keys and/or SSH keys for CodeCommit. Those permissions to deploy and manage your resources on AWS could introduce a vulnerability to your application, because everyone in your organisation with access to your repo will have access to your AWS resources through the pipeline.

So, to solve this, restrict your Bitbucket Pipelines access to only s3:PutObject and/or codecommit:GitPush, and let CodePipeline continue the orchestration (a sketch of such a scoped-down policy follows the examples below). For example:

  • You can use Bitbucket Pipelines to push code to CodeCommit where CodePipeline can pick up the code and start an execution. See Jay Proulx’s post.
  • You can use Bitbucket Pipelines to run tests on commits to your feature/* and hotfix/* branches but use CodePipeline to run tests, builds and deployments from your fixed branches such as develop and master.
  • You can use Bitbucket Pipelines to run tests and builds on your feature/* (or even release/*) branches, then push the build output to a fixed S3 location from which CodePipeline can pick up the package and deploy it to the relevant environments and applications.
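
As a sketch of that scoped-down access (the bucket, repo, account and user names below are illustrative, so adjust the resources to your own), an inline IAM policy on the Bitbucket Pipelines user could allow nothing more than the upload and the push:

# Illustrative only: restrict the Bitbucket Pipelines IAM user to a single
# bucket prefix and a single CodeCommit repository
aws iam put-user-policy \
  --user-name bitbucket-pipelines \
  --policy-name bitbucket-minimal-access \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::mypackagesbucket/Dev/*"
      },
      {
        "Effect": "Allow",
        "Action": "codecommit:GitPush",
        "Resource": "arn:aws:codecommit:ap-southeast-2:111111111111:my-app-repo"
      }
    ]
  }'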

Hot tip: if you’re uploading a package to S3 from Bitbucket Pipelines as suggested above, add codepipeline-artifact-revision-summary to the S3 object’s metadata and CodePipeline will show it as the Revision Summary of the execution. For example, the command below pushes a package with Bitbucket’s commit ID (an available pipeline variable) as the Revision Summary:

aws s3 cp BackendPackage.zip s3://mypackagesbucket/Dev/BackendPackage.zip --metadata codepipeline-artifact-revision-summary="${BITBUCKET_COMMIT}"
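
And if you want to confirm the metadata actually landed on the object, a quick head-object call (same hypothetical bucket and key as above) will show it under Metadata:

# Confirm the revision summary metadata is on the uploaded object
aws s3api head-object --bucket mypackagesbucket --key Dev/BackendPackage.zip --query 'Metadata'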

Last but not least, if you must, you can use Jenkins Pipelines.

I’m not against it. If your applications require Jenkins, and you have the skills or the team to manage it, go with it. Jenkins is very customisable, has a lot of plugins and plenty of support online, and you can manage its pipelines with version-controlled text files, the same as you would with CodePipeline and Bitbucket Pipelines. But if you’re planning on using Jenkins, use it wisely: plan and architect it to your needs.
