Continuous Integration (CI) is a development practice in which developers integrate code into a shared repository frequently, usually several times a day. Each integration is then verified by an automated build and automated tests. Several CI systems are available; the platform used here is a popular option known for its ease of use and GitHub integration, and when combined with AWS services it can offer a robust and scalable CI solution.

Setting Up with AWS

When setting up with AWS, you can use any of our AWS workflows, such as the ECS Fargate workflow or the EKS EC2 ASG workflow.

  • Sign up for an AWS account if you don't have one.
  • Create an IAM user with programmatic access. This will give you an access key ID and a secret access key.
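If you prefer the command line, the console steps above can also be sketched with the AWS CLI. The user name and attached policy below are hypothetical examples, and the commands are prefixed with echo as a dry run so you can review them before executing for real:

```shell
# Sketch of the IAM setup above using the AWS CLI.
# "ci-deploy-user" and the ECR policy are example choices, not requirements.
# Commands are echoed (dry run); remove the leading "echo" to execute.
USER_NAME="ci-deploy-user"

echo aws iam create-user --user-name "$USER_NAME"
echo aws iam attach-user-policy --user-name "$USER_NAME" \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser
echo aws iam create-access-key --user-name "$USER_NAME"
```

The create-access-key call is what returns the Access Key ID and Secret Access Key you will store as secrets in the next section.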

Create Secrets from Settings

  • Back in your dashboard, create your secrets by selecting Settings → Secrets.

Secrets are encrypted environment variables used within your workflow to build, run, and deploy your application.

You will create four secrets (AWS access key ID, AWS secret access key, AWS account number, and GitHub token):

  • To create your AWS access key and secret key, log in to your AWS account, open the Identity and Access Management (IAM) dashboard, create a new user, and copy the Access Key ID and Secret Access Key into your Secrets dashboard.
  • Your AWS account number can be found in the account menu at the top-right corner of your AWS Console.
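As a quick sanity check (a sketch, not part of the official setup), you can verify that the value you copied looks like a valid account number before saving it as a secret. The hard-coded value below is an example; with configured credentials you could instead read the real value with aws sts get-caller-identity --query Account --output text.

```shell
# Example value only; substitute the account number copied from the console.
AWS_ACCOUNT_NUMBER="123456789012"

# AWS account IDs are always exactly 12 digits; catch copy/paste mistakes early.
if echo "$AWS_ACCOUNT_NUMBER" | grep -Eq '^[0-9]{12}$'; then
  echo "account number format OK"
else
  echo "account number must be exactly 12 digits" >&2
  exit 1
fi
```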

Generate a GitHub Token

Generate your GITHUB_TOKEN from GitHub by going to Settings → Developer settings → Personal access tokens → Generate new token in your GitHub profile.

  • Back in your Secrets dashboard, create your secret key/value pairs.

Next, your pipeline will reference these credentials from your ops.yml file. The ops.yml file is a configuration file written in YAML (YAML Ain't Markup Language) that manages the automated building and testing of software projects within the CI environment.

Example ops.yml configuration

version: "1"
pipelines:
  - name: my-python-application-pipeline:1.0.0
    description: Build and push a Python Docker image to AWS ECR
    env:
      static:
        - PYTHON_VERSION=3.8
        - AWS_REGION=us-west-2
        - REPO=my-application
      secrets:
        - AWS_ACCESS_KEY_ID
        - AWS_SECRET_ACCESS_KEY
        - AWS_ACCOUNT_NUMBER
    events:
      - "github:my-org/my-repo:pull_request.merged"
      - "github:my-org/my-repo:pull_request.opened"
      - "github:my-org/my-repo:pull_request.synchronize"
    jobs:
      - name: python-docker-build-job
        description: Building Docker Image for Python Application
        packages:
          - python3
          - docker
        steps:
          - pip install awscli
          - aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com
          - docker build -t $REPO .
          - docker tag $REPO:latest $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest
          - docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest

In the configuration above:


  • version: "1" specifies the version of the pipeline syntax used.

  • pipelines: defines a list of pipelines. In this case, there's only one pipeline defined.

Pipeline Details:

  • name: The name of the pipeline, my-python-application-pipeline:1.0.0.
  • description: A brief description of what the pipeline does - building and pushing a Python Docker image to AWS ECR.


Environment Variables:

  • static: These are non-sensitive environment variables. They include the Python version (PYTHON_VERSION=3.8), the AWS region (AWS_REGION=us-west-2), and the repository name (REPO=my-application).
  • secrets: These are sensitive environment variables that should be stored securely. They include the AWS access key, secret access key, and account number.


Events:

  • events: This section defines the GitHub events that trigger the pipeline: merged, newly opened, and synchronized pull requests in the specified repository.
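The trigger strings appear to follow a provider:owner/repo:event.action pattern (inferred from the example configuration, not an official specification). A minimal shell sketch of how such a string breaks into its parts:

```shell
# Split "provider:owner/repo:event.action" (assumed format) into its parts
# using POSIX parameter expansion.
EVENT="github:my-org/my-repo:pull_request.merged"

PROVIDER=${EVENT%%:*}     # text before the first colon
REST=${EVENT#*:}          # everything after the first colon
REPO_SLUG=${REST%%:*}     # owner/repo portion
ACTION=${REST#*:}         # event.action portion

echo "$PROVIDER $REPO_SLUG $ACTION"
```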


Jobs:

  • jobs: A job named python-docker-build-job is defined here.
  • description: Describes what the job does - building a Docker image for a Python application.
  • packages: Specifies the packages needed for the job - Python 3 and Docker in this case.


Steps:

  • steps: These are the commands executed by the job.
  • pip install awscli: Installs the AWS Command Line Interface.
  • aws ecr get-login-password, piped into docker login, logs Docker in to AWS ECR using the AWS CLI.
  • docker build -t $REPO .: Builds the Docker image and tags it with the repository name.
  • docker tag: Retags the image with the full ECR path, $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest, so it can be pushed to the registry.
  • docker push: Pushes the tagged image to AWS ECR.
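To see how the tag and push targets are assembled, here is the ECR image URI built up from the same variables the pipeline uses (the account number is an illustrative placeholder, not a real account):

```shell
# Illustrative values matching the static env section of the config.
AWS_ACCOUNT_NUMBER="123456789012"   # example account number only
AWS_REGION="us-west-2"
REPO="my-application"

# ECR registry host follows the pattern <account>.dkr.ecr.<region>.amazonaws.com
ECR_REGISTRY="$AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com"
IMAGE_URI="$ECR_REGISTRY/$REPO:latest"

echo "$IMAGE_URI"
```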

The configuration file above automates the process of building, tagging, and deploying a Dockerized Python application to AWS ECR, triggered by specific events in a GitHub repository.


Integrating your CI platform with AWS provides a scalable and efficient way to manage CI/CD pipelines. By combining AWS services with these workflows, teams can automate their deployment processes, ensuring that new code changes are continuously integrated and tested. Get started with the workflows above to build a scalable CI/CD process.