Introduction

Continuous Integration (CI) is a development practice where developers integrate code into a shared repository frequently, usually several times a day. Each integration can then be verified by an automated build and automated tests. While several CI systems are available, CTO.ai is a popular option known for its ease of use and GitHub integration. When combined with AWS services, CTO.ai can offer a robust and scalable CI solution.

Setting Up CTO.ai with AWS

When setting up CTO.ai with AWS, you can use any of the CTO.ai AWS workflows, such as the ECS Fargate workflow or the EKS EC2 ASG workflow.

  • Sign up for an AWS account if you don't have one.
  • Create an IAM user with programmatic access; this gives you an access key ID and a secret access key (a CLI sketch follows below).
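
If you prefer the command line, the same user can be created with the AWS CLI. This is a minimal sketch, not the only way to do it; the user name ctoai-ci and the attached policy are illustrative assumptions, and you should scope permissions to what your workflow actually needs (ECR, in this example).

# Assumes the AWS CLI is already configured with administrator credentials.
# "ctoai-ci" is a placeholder user name -- choose your own.
aws iam create-user --user-name ctoai-ci

# Grant ECR push/pull rights; swap in a tighter policy if you have one.
aws iam attach-user-policy \
  --user-name ctoai-ci \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser

# Create the key pair; the SecretAccessKey is shown only once, so save it now.
aws iam create-access-key --user-name ctoai-ci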

Create Secrets from Settings

  • Back in your CTO.ai dashboard, create your secrets by selecting Settings → Secrets.

Secrets are encrypted environment variables that CTO.ai uses within your workflow to build, run, and deploy your application.

You will create four secrets:

  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • AWS_ACCOUNT_NUMBER
  • GITHUB_TOKEN
  • To obtain your AWS access key ID and secret access key, log into your AWS account, open the Identity and Access Management (IAM) dashboard, create a new user (as described above), then copy the Access key ID and Secret access key and paste them into your Secrets dashboard on CTO.ai.
  • Your AWS account number is displayed in the account menu at the top-right corner of the AWS Console (a CLI alternative is sketched below).
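
If the AWS CLI is configured locally, the account number can also be retrieved programmatically with the standard sts get-caller-identity call:

# Prints the 12-digit account ID for the currently configured credentials.
aws sts get-caller-identity --query Account --output text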

Generate a GitHub Token

Generate your GITHUB_TOKEN from GitHub by going to Settings → Developer settings → Personal access tokens → Generate new token on your GitHub profile.

  • Back in your CTO.ai Secrets dashboard, create each secret as a key/value pair (you can sanity-check the GitHub token first, as shown below).
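
Before saving the GitHub token, it is worth a quick check against the GitHub API; a minimal sketch, assuming the token is exported as GITHUB_TOKEN in your shell:

# A JSON response with your user profile confirms the token is valid;
# a 401 means it is missing, expired, or mistyped.
curl -s -H "Authorization: token $GITHUB_TOKEN" https://api.github.com/user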

Next, reference your AWS credentials in your ops.yml file. The ops.yml file is the configuration file CTO.ai uses to manage the automated building and testing of software projects; it is written in YAML (a recursive acronym for "YAML Ain't Markup Language"). Note that secrets are referenced by name in this file rather than stored in it as plain text.

Example ops.yml configuration

version: "1"
pipelines:
  - name: my-python-application-pipeline:1.0.0
    description: Build and push a Python Docker image to AWS ECR
    env:
      static:
        - PYTHON_VERSION=3.8
        - AWS_REGION=us-west-2
        - REPO=my-application
      secrets:
        - AWS_ACCESS_KEY_ID
        - AWS_SECRET_ACCESS_KEY
        - AWS_ACCOUNT_NUMBER
    events:
      - "github:my-org/my-repo:pull_request.merged"
      - "github:my-org/my-repo:pull_request.opened"
      - "github:my-org/my-repo:pull_request.synchronize"
    jobs:
      - name: python-docker-build-job
        description: Building Docker Image for Python Application
        packages:
          - python3
          - docker
        steps:
          - pip install awscli
          - aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com
          - docker build -t $REPO .
          - docker tag $REPO:latest $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest
          - docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest

In the configuration above:

Version:

  • version: "1" specifies the version of the pipeline syntax used.

Pipelines:

  • This section defines a list of pipelines. In this case, there's only one pipeline defined.

Pipeline Details:

  • name: The name of the pipeline, my-python-application-pipeline:1.0.0.
  • description: A brief description of what the pipeline does - building and pushing a Python Docker image to AWS ECR.

env:

  • static: These are environment variables that are not sensitive. They include the Python version (PYTHON_VERSION=3.8), the AWS region (AWS_REGION=us-west-2), and the repository name (REPO=my-application).
  • secrets: These are sensitive environment variables that must be stored securely. They include the AWS access key ID, secret access key, and account number.

Events:

  • This section defines the GitHub events that will trigger the pipeline. The pipeline is set to trigger on merged pull requests, newly opened pull requests, and synchronized pull requests in the specified repository.

Jobs:

  • A job named python-docker-build-job is defined here.
  • description: Describes what the job does - building a Docker image for a Python application.
  • packages: Specifies the packages needed for the job - Python 3 and Docker in this case.

steps:

  • These are the commands the job executes; they can also be run locally to debug, as sketched after this list.
  • pip install awscli: Installs the AWS Command Line Interface.
  • aws ecr get-login-password ... | docker login ...: Retrieves an authentication token from ECR and pipes it to docker login so Docker can authenticate against your account's registry.
  • docker build -t $REPO .: Builds the Docker image and tags it with the repository name.
  • docker tag $REPO:latest $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest: Tags the Docker image for pushing to AWS ECR.
  • docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest: Pushes the Docker image to AWS ECR.
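
If a pipeline run fails, the same steps can be reproduced locally; a sketch, assuming Docker and the AWS CLI are installed and the placeholder values below are replaced with your own:

# Placeholder values -- substitute your own account ID, region, and repo name.
export AWS_REGION=us-west-2
export AWS_ACCOUNT_NUMBER=123456789012
export REPO=my-application

# Authenticate Docker against ECR, then build, tag, and push the image.
aws ecr get-login-password --region "$AWS_REGION" \
  | docker login --username AWS --password-stdin "$AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com"
docker build -t "$REPO" .
docker tag "$REPO:latest" "$AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest"
docker push "$AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO:latest"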

The configuration file above automates the process of building, tagging, and pushing a Dockerized Python application image to AWS ECR, triggered by specific events in a GitHub repository.
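
Note that docker push does not create the repository for you: the ECR repository must exist before the first push. A one-time setup sketch, using the repository name and region from the static env values above:

# One-time setup: create the ECR repository the pipeline pushes to.
aws ecr create-repository \
  --repository-name my-application \
  --region us-west-2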


Conclusion

Integrating CTO.ai with AWS provides a scalable and efficient way to manage CI/CD pipelines. With CTO.ai's AWS workflows, teams can automate their deployment processes, ensuring that every code change is built, integrated, and tested before it ships. Get started with CTO.ai workflows to build a scalable CI/CD process.