Introduction

In this tutorial, we explore CTO.ai, a robust CI/CD service known for its speed and user-friendly interface. Starting with a simple pipeline, we'll advance to intricate, multi-stage workflows.

Set Up CTO.ai

  • Install the Ops CLI by running npm install -g @cto.ai/ops on your local machine (a command summary follows this list).

    The CTO.ai ops CLI acts as the integration point between your command line workflows and the CTO.ai platform, providing a framework for implementing ChatOps-based workflows that integrate with your existing processes. It allows you to build, publish, and run Commands, Pipelines, and Services workflows from the comfort of your terminal, manage your Configs and Secrets stores, and invite members to your CTO.ai team.
  • Log in to your account using ops account:signin. Running this command triggers a browser-based sign-in flow that lets you log in to your account as you would on the web, as well as sign in with OAuth providers like GitHub.
  • In your GitHub repository, create a config file called ops.yml. This file contains the instructions, steps, and commands CTO.ai uses for your project.
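In practice, the first two steps come down to a pair of terminal commands; the third is an ordinary file you commit to your repository:

# Install the Ops CLI globally (requires Node.js and npm)
npm install -g @cto.ai/ops

# Start the browser-based sign-in flow
ops account:signin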

Simple Pipeline Configuration

Start by configuring your first CTO.ai Pipeline job in your ops.yml:

version: "1"
pipelines:
  - name: Hello World Pipeline
    description: Build and create a hello world pipeline
    env:
      static:
        - STACK_TYPE=do-k8s-cdktf
        - GH_ORG=workflows-sh
        - REPO=sample-expressjs-do-k8s-cdktf
      secrets:
        - GITHUB_TOKEN
    events:
      - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.opened"
    jobs:
      - name: sample-expressjs-build-do-k8s-cdktf
        description: Build step for sample-expressjs-do-k8s-cdktf
        packages:
          - git
        steps:
          - echo "Hello World!"

This configuration sets up a basic job that echoes "Hello World!" whenever a pull request is opened in the specified repository.
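With the config committed, you can exercise the workflow from your terminal. The Ops CLI is described above as the tool for building, publishing, and running Pipelines; the exact arguments below are an assumption, so confirm them with ops help for your CLI version.

# From the repository root, where ops.yml lives
ops build .      # build the workflow defined in ops.yml
ops publish .    # publish it to your CTO.ai team
ops run .        # run it and watch the Hello World step execute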

Expanding Your Pipeline

Integrating Tests

Here’s how to expand the pipeline to include tests:

version: "1"
pipelines:
  - name: run-unit-tests-pipeline
    description: Pipeline to run unit tests
    jobs:
      - name: unit-tests
        description: Run Unit Tests
        steps:
          - checkout
          - make test

This block incorporates steps to check out the code and run unit tests.
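If the job's container needs tooling for these steps, such as git for the checkout or make for the test target, you can declare it with a packages list, just as the other pipelines in this tutorial do. A minimal sketch, assuming the same schema:

version: "1"
pipelines:
  - name: run-unit-tests-pipeline
    description: Pipeline to run unit tests
    jobs:
      - name: unit-tests
        description: Run Unit Tests
        packages:
          - git
          - make
        steps:
          - checkout
          - make test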

Creating Complex Workflows

Here's a more complex workflow with multiple stages and jobs, combining a Pipeline with a Service:

version: "1"
pipelines:
  - name: sample-expressjs-pipeline-aws-ecs-fargate:0.1.2
    description: build a release for deployment
    env:
      static:
        - DEBIAN_FRONTEND=noninteractive
        - ORG=workflows-sh
        - REPO=sample-expressjs-aws-ecs-fargate
        - AWS_REGION=us-west-1
        - STACK_TYPE=aws-ecs-fargate
      secrets:
        - GITHUB_TOKEN
        - AWS_ACCESS_KEY_ID
        - AWS_SECRET_ACCESS_KEY
        - AWS_ACCOUNT_NUMBER
    events:
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.merged"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.opened"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.synchronize"
    jobs:
      - name: sample-expressjs-build-job-aws-ecs-fargate
        description: sample-expressjs build step
        packages:
          - git
          - unzip
          - python
        steps:
          - curl https://s3.amazonaws.com/aws-cli/awscli-bundle-1.18.200.zip -o awscli-bundle.zip
          - unzip awscli-bundle.zip && ./awscli-bundle/install -b ~/bin/aws
          - export PATH=~/bin:$PATH
          - aws --version
          - git clone https://oauth2:$GITHUB_TOKEN@github.com/$ORG/$REPO
          - cd $REPO && ls -asl
          - git fetch && git checkout $REF
          - aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO
          - docker build -f Dockerfile -t $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO-$STACK_TYPE:$REF .
          - docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO-$STACK_TYPE:$REF
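# The services section below defines the sample Express.js service itself:
# the command that starts it, the port it exposes, and the GitHub events and
# lifecycle triggers (build, publish, start) associated with it.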
services:
  - name: sample-expressjs-service-aws-ecs-fargate:0.1.1
    description: A sample expressjs service
    run: node /ops/index.js
    port: [ '8080:8080' ]
    sdk: off
    domain: ""
    env:
      static:
        - PORT=8080
    events:
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.merged"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.opened"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.synchronize"
    trigger:
      - build
      - publish
      - start

Pipeline Configuration

  • name: The unique identifier for the pipeline.
  • description: A brief description of what the pipeline does.
  • env: Specifies environment variables:
      • static: Fixed environment variables such as DEBIAN_FRONTEND (to prevent interactive prompts during installations), organizational identifiers (ORG, REPO), the AWS region (AWS_REGION), and the stack type (STACK_TYPE).
      • secrets: Sensitive data such as GITHUB_TOKEN, the AWS credentials, and the AWS account number.
  • events: Triggers for the pipeline. This pipeline activates on GitHub pull request events (merged, opened, synchronize) in the specified repository.
  • jobs: The tasks the pipeline executes. Each job includes:
      • name and description: Identifiers and descriptions of the job.
      • packages: Software or tools to be installed, like git, unzip, and python.
      • steps: The commands the job runs. Here they install the AWS CLI, clone the specified GitHub repository, check out the relevant ref ($REF), log in to Amazon Elastic Container Registry (ECR) using the AWS CLI, build a Docker image, and push it to ECR.
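Because the pipeline reads GITHUB_TOKEN and the AWS credentials from your Secrets store, store them before the first run. The Ops CLI manages the Secrets store, as noted in the setup section; the secrets:set flags below are an assumption, so verify the exact syntax with ops help.

# Store the secrets the pipeline expects (flags assumed; check `ops help`)
ops secrets:set -k GITHUB_TOKEN -v <your GitHub personal access token>
ops secrets:set -k AWS_ACCESS_KEY_ID -v <access key id>
ops secrets:set -k AWS_SECRET_ACCESS_KEY -v <secret access key>
ops secrets:set -k AWS_ACCOUNT_NUMBER -v <account number>

# Review what is stored for your team
ops secrets:list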

Conclusion

CTO.ai's adaptability allows it to manage a range of CI/CD needs, from simple script executions to complex, multi-stage workflows. This tutorial lays a foundation for developing scalable pipelines in CTO.ai. Get started for free with CTO.ai workflows.