In this four-part series, we walk you through how to set up a secure, reliable, highly available, and cost-effective CI pipeline.
Companies moving to the cloud as part of an Agile DevOps practice will require a continuous integration (CI) pipeline and a code repository to efficiently manage, test, and deploy applications.
Having the CI pipeline in the cloud has several advantages over an on-prem solution. I’ve listed a few below.
- Secure Universal Access – Global teams can securely access the tools from anywhere in the world.
- Scalability and High Performance – Using features of the cloud provider such as autoscaling and load balancing, you can scale up or down based on demand. You only pay for the resources you need, when you need them. You can spin up new servers to run a task and spin them back down when the task is complete – saving hours, days, or even weeks of effort.
- Reliability – With the cloud, you take advantage of the redundancy and reliability built into the hardware and software, as well as a 24x7x365 team of professionals keeping the platform running for the millions of customers using it.
- Flexibility – As technology advances, it is easy to swap out the underlying technology if you are running on a cloud platform. There is no need to purchase and provision new hardware in your data center when you want a faster processor, faster storage, or a faster network. Simply run a few commands and start taking advantage of the advancements provided by the cloud provider.
I’ll walk you through how to set up a secure, reliable, highly available, and cost-effective CI platform using concepts like infrastructure as code and immutable infrastructure.
When complete, we will have a fully functional, cloud-based DevOps platform that allows a developer to check code in to a self-hosted GitLab cluster, build the application into a Docker container using a Jenkins pipeline, and deploy the application to Amazon Web Services (AWS) Elastic Container Service using Fargate.
The resulting PHP application will be publicly accessible via a URL hosted behind an AWS Application Load Balancer.
Read the Blog Series
In this four-part series I will guide you through what it takes to build a cloud-based CI platform using AWS.
Part 1 – Scripting a VPC Landing Zone using Terraform on AWS
I’ll talk about building the core infrastructure in AWS. We will refer to this as the “Landing Zone”, which will consist of components like VPCs, Subnets, Gateways and Routing Tables, Security and Identity Management, and more. Read the blog
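To give a flavor of what Part 1 covers, here is a minimal Terraform sketch of a VPC with one public subnet, an internet gateway, and a route table. The region, CIDR ranges, and resource names are illustrative placeholders, not the values used in the series, and security and identity resources are omitted:

```hcl
# Illustrative only: a minimal landing-zone VPC with one public subnet.
provider "aws" {
  region = "us-east-1" # assumed region
}

resource "aws_vpc" "landing_zone" {
  cidr_block           = "10.0.0.0/16"
  enable_dns_hostnames = true
  tags                 = { Name = "ci-landing-zone" }
}

resource "aws_subnet" "public" {
  vpc_id                  = aws_vpc.landing_zone.id
  cidr_block              = "10.0.1.0/24"
  map_public_ip_on_launch = true
}

resource "aws_internet_gateway" "igw" {
  vpc_id = aws_vpc.landing_zone.id
}

resource "aws_route_table" "public" {
  vpc_id = aws_vpc.landing_zone.id
  route {
    cidr_block = "0.0.0.0/0"
    gateway_id = aws_internet_gateway.igw.id
  }
}

resource "aws_route_table_association" "public" {
  subnet_id      = aws_subnet.public.id
  route_table_id = aws_route_table.public.id
}
```

Because the whole landing zone is expressed as code like this, it can be torn down and rebuilt on demand – the immutable-infrastructure idea the series is built around.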
Part 2 – Scripting GitLab and Jenkins Installs using Terraform on AWS
I’ll get into how to script the deployment of a Jenkins master server, Jenkins slaves within an autoscaling group, and a highly available GitLab cluster behind a load balancer. Read the blog
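As a preview of the Part 2 approach, the Jenkins slaves can be declared as an autoscaling group in Terraform. The AMI ID, instance type, sizing, and the `var.private_subnet_ids` variable below are illustrative assumptions, not the values used in the series:

```hcl
# Illustrative only: Jenkins build slaves in an autoscaling group.
resource "aws_launch_template" "jenkins_slave" {
  name_prefix   = "jenkins-slave-"
  image_id      = "ami-0123456789abcdef0" # placeholder AMI with the Jenkins agent baked in
  instance_type = "t3.medium"
}

resource "aws_autoscaling_group" "jenkins_slaves" {
  min_size            = 0
  max_size            = 4
  desired_capacity    = 1
  vpc_zone_identifier = var.private_subnet_ids # assumed output of the landing zone

  launch_template {
    id      = aws_launch_template.jenkins_slave.id
    version = "$Latest"
  }
}
```

Setting `min_size = 0` captures the pay-for-what-you-use benefit discussed earlier: build capacity scales to zero when no jobs are running.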
Part 3 – Scripting a Container Platform using Fargate and ECR on AWS
I will expand the platform to include a container registry (ECR) and a serverless container platform where we can deploy our code to Elastic Container Service (ECS) using the AWS Fargate service.
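The Part 3 building blocks might be sketched in Terraform as follows. Repository, cluster, and service names are placeholders, and IAM roles and load balancer wiring are omitted for brevity:

```hcl
# Illustrative only: an ECR repository plus a Fargate-launched ECS service.
resource "aws_ecr_repository" "app" {
  name = "php-app" # placeholder repository name
}

resource "aws_ecs_cluster" "ci" {
  name = "ci-platform"
}

resource "aws_ecs_task_definition" "app" {
  family                   = "php-app"
  requires_compatibilities = ["FARGATE"]
  network_mode             = "awsvpc"
  cpu                      = "256"
  memory                   = "512"
  container_definitions = jsonencode([{
    name         = "php-app"
    image        = "${aws_ecr_repository.app.repository_url}:latest"
    portMappings = [{ containerPort = 80 }]
  }])
}

resource "aws_ecs_service" "app" {
  name            = "php-app"
  cluster         = aws_ecs_cluster.ci.id
  task_definition = aws_ecs_task_definition.app.arn
  desired_count   = 2
  launch_type     = "FARGATE"

  network_configuration {
    subnets = var.private_subnet_ids # assumed output of the landing zone
  }
}
```

With Fargate there are no container hosts to patch or scale – AWS provisions the compute for each task, which is the "serverless container platform" referred to above.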
Part 4 – Deploying Docker Containers to AWS Fargate using Jenkins
I will focus on building Jenkins pipelines that build and deploy load-balanced, containerized applications to AWS. A key part of this post shows how to integrate Jenkins with AWS Elastic Container Registry, Elastic Container Service, and Fargate.
The pipeline will check out our source code from GitLab, build and push a Docker image to AWS ECR, and create or update a Fargate task in ECS. The result will be a functioning application hosted on AWS via a public DNS name (URL).
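To make the flow concrete, those stages might look like the declarative Jenkinsfile sketch below. The GitLab URL, ECR registry address, and cluster/service names are placeholders, not the values used in the series:

```groovy
// Illustrative only: checkout, build and push to ECR, then redeploy on Fargate.
pipeline {
    agent any
    environment {
        ECR_REPO = '123456789012.dkr.ecr.us-east-1.amazonaws.com/php-app' // placeholder
    }
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://gitlab.example.com/team/php-app.git' // placeholder GitLab URL
            }
        }
        stage('Build and Push') {
            steps {
                sh 'docker build -t $ECR_REPO:$BUILD_NUMBER .'
                sh 'aws ecr get-login-password | docker login --username AWS --password-stdin $ECR_REPO'
                sh 'docker push $ECR_REPO:$BUILD_NUMBER'
            }
        }
        stage('Deploy') {
            steps {
                // Forces ECS to launch fresh Fargate tasks pulling the new image
                sh 'aws ecs update-service --cluster ci-platform --service php-app --force-new-deployment'
            }
        }
    }
}
```

The build slaves would need Docker, the AWS CLI, and an IAM role with ECR and ECS permissions for these steps to run.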
Here is what the platform will look like when we are complete: