Continuous Integration and Continuous Delivery (CI/CD) has cemented its place as a top-tier software practice. It emphasizes frequent, automated deployments, ensuring that software can be reliably released at any time. Combined with Kubernetes, a powerful system for managing containerized applications, teams can achieve high levels of efficiency and automation.

In this tutorial, we'll explore how to set up a CI/CD pipeline using CTO.ai, a cloud-based CI/CD service, and deploy an application to a Kubernetes cluster. By the end, you'll have a grasp of the fundamentals and a working example to guide your own implementations.

Prerequisites

  • Basic knowledge of Git & GitHub.
  • A CTO.ai account.
  • Basic understanding of Docker and Kubernetes.
  • A Kubernetes cluster (e.g., Minikube, EKS, GKE, AKS, or DigitalOcean Managed Kubernetes).
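Before continuing, it's worth confirming the core tools are on your PATH. A quick check (doctl is only needed if you follow the DigitalOcean workflow):

```shell
# Print the location of each prerequisite tool, or flag it as missing.
for tool in git docker kubectl doctl; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $(command -v "$tool")"
  else
    echo "$tool: NOT FOUND - install it before continuing"
  fi
done
```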

Setting Up the Project Repository

First, you'll need a code repository. For the sake of this tutorial, we'll use a simple Node.js application.

  • Start by creating a new repository on GitHub.
  • Clone the repository to your local machine and add your Node.js app.
  • Make sure your application includes a Dockerfile, which is essential for creating a Docker image.
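If your app doesn't have a Dockerfile yet, a minimal sketch might look like this; it assumes a conventional layout with a package.json and an index.js entry point, so adjust the names to your app:

```dockerfile
# Minimal Dockerfile sketch for a Node.js app (hypothetical layout).
FROM node:18-alpine
WORKDIR /app
# Install dependencies first so Docker can cache this layer.
COPY package*.json ./
RUN npm ci --omit=dev
# Copy the rest of the source.
COPY . .
EXPOSE 8080
CMD ["node", "index.js"]
```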

This tutorial uses a sample repository, built on the managed DigitalOcean Kubernetes workflow, that you can use to test your code changes. The repository is the cornerstone of your CI/CD process: new code changes trigger the pipeline's actions. Set up the DigitalOcean Kubernetes infrastructure stack, which provisions and deploys your DigitalOcean resources and dependencies for you with a single click, without you having to worry about the underlying infrastructure.

You can clone the repository with:

```shell
git clone https://github.com/workflows-sh/do-k8s-cdktf.git
cd do-k8s-cdktf
```


  • After deploying the DigitalOcean infrastructure, log in to your CTO.ai account and add your secrets (such as GITHUB_TOKEN and DO_TOKEN) in the dashboard, pointing to your GitHub repository.

  • Create an ops.yml file at the root of your repository if one doesn't exist already. The complete ops.yml file is included in the sample repository, and you can clone it and modify it for your own build.

Here’s a basic example of what the ops.yml file might include:

```yaml
version: "1"
pipelines:
  - name: sample-expressjs-pipeline-do-k8s-cdktf:0.2.5
    description: Build and Publish an image in a DigitalOcean Container Registry
    env:
      static:
        - DEBIAN_FRONTEND=noninteractive
        - STACK_TYPE=do-k8s-cdktf
        - ORG=cto-ai
        - GH_ORG=workflows-sh
        - REPO=sample-expressjs-do-k8s-cdktf
        - BIN_LOCATION=/tmp/tools
      secrets:
        - GITHUB_TOKEN
        - DO_TOKEN
    events:
      - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.merged"
    jobs:
      - name: sample-expressjs-build-do-k8s-cdktf
        description: Build step for sample-expressjs-do-k8s-cdktf
        packages:
          - git
          - unzip
          - wget
          - tar
        steps:
          - mkdir -p $BIN_LOCATION
          # make the downloaded doctl binary discoverable in later steps
          - export PATH=$PATH:$BIN_LOCATION
          - wget $DOCTL_DL_URL -O doctl.tar.gz
          - tar xf doctl.tar.gz -C $BIN_LOCATION
          - doctl version
          - git version
          - docker build -f Dockerfile -t one-img-to-rule-them-all:latest .
          - docker tag one-img-to-rule-them-all:latest registry.digitalocean.com/$ORG/$REPO:$CLEAN_REF
          - docker push registry.digitalocean.com/$ORG/$REPO:$CLEAN_REF
services:
  - name: sample-expressjs-service-do-k8s-cdktf:0.1.6
    description: Preview of image built by the pipeline
    run: node /ops/index.js
    port: [ '8080:8080' ]
    sdk: off
    domain: ""
    env:
      static:
        - PORT=8080
    events:
      - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.opened"
      - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.merged"
    trigger:
      - build
      - publish
      - start
```

This YAML configuration sets up a CI/CD pipeline for a sample Express.js application, leveraging DigitalOcean's services, particularly the DigitalOcean Container Registry. Let's break down the components and understand what each part of the configuration does:

Version: Specifies the version of the configuration syntax, which is "1" in this case.

Pipelines: The main part of the configuration where the different pipelines for your CI/CD process are defined.

  • name: Unique identifier for the pipeline. The naming convention embeds the app name, the stack type, and a version tag, for clarity and uniformity across projects within the organization.
  • env: Specifies environment variables for the pipeline, in two groups:
  • static: Plain environment variables set in the pipeline's environment.
  • secrets: Sensitive values that are stored securely. For instance, GITHUB_TOKEN and DO_TOKEN are needed to authenticate with GitHub and DigitalOcean, respectively.
  • events: Specifies the events that trigger the pipeline. In this case, it's GitHub events related to pull requests.
  • jobs: Defines the jobs to be run within the pipeline. There's a single job here with multiple steps: installing packages, setting up doctl, building the Docker image, and pushing it to the DigitalOcean Container Registry.

Services: This section is specific to CTO.ai, the platform running these pipelines. It defines the services that should be run as part of the CI/CD process.

  • name: Similar to the pipeline, this is a unique identifier for the service.
  • description: A brief about what the service does.
  • run: Specifies the command to start the service, which in this context, is a Node.js application.
  • port: Maps the host port to the container port; necessary for accessing the service.
  • env: Environment variables specific to the service.
  • events: Similar to the pipeline, specifies the events that trigger the service.
  • trigger: Defines the lifecycle stages that trigger this service; in this case, the build, publish, and start stages.

The main purpose of this configuration is to automate the process of testing, building, and deploying an Express.js application in response to specific GitHub pull request events. It does so by defining a detailed workflow for handling the code, building it into a Docker container, and pushing it to a registry, presumably for subsequent deployment on a Kubernetes cluster managed by DigitalOcean.
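Note that the pipeline above stops at pushing the image; in this workflow the deployment itself is handled by the do-k8s-cdktf infrastructure stack. For orientation, a hand-written Kubernetes Deployment consuming the pushed image would look roughly like this (all names and the image tag here are illustrative):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-expressjs        # illustrative name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: sample-expressjs
  template:
    metadata:
      labels:
        app: sample-expressjs
    spec:
      containers:
        - name: app
          # image pushed by the pipeline; the tag follows $CLEAN_REF
          image: registry.digitalocean.com/cto-ai/sample-expressjs-do-k8s-cdktf:main
          ports:
            - containerPort: 8080
```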

Trigger Pipelines with Event Triggers

When you’re done configuring your ops.yml file, you can trigger your Pipelines using events on your GitHub repository. We configured our CI/CD Pipelines to run when a pull request is opened or merged.

In your GitHub repository, make some changes and create a pull request. Once you open your pull request, you'll see that your CI/CD Pipelines will start building and deploying your DigitalOcean resources to your Kubernetes cluster.
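The branch-and-PR steps can be sketched as follows. The commands below set up a throwaway local repo so they run as-is; in practice, run the branch, commit, and push steps inside your clone, with origin pointing at your GitHub repository:

```shell
set -e
# Throwaway local repo so this sketch is self-contained.
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email "you@example.com"
git config user.name "Your Name"
echo "# sample" > README.md
git add README.md
git commit -qm "initial commit"

# The actual workflow: branch, change, commit, then push and open a PR.
git checkout -qb feature/test-pipeline
echo "pipeline test" >> README.md
git add README.md
git commit -qm "Test CI/CD pipeline trigger"
# git push -u origin feature/test-pipeline   # push to GitHub (needs a real remote)
# gh pr create --fill                        # open the PR with the GitHub CLI
git log --oneline
```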

  • Back in your Pipeline dashboard, you can see your Pipeline logs with a detailed overview of your VERSION, RUN ID, TAG, and STATUS.

  • View your Pipeline logs to track performance and access your test results with the job output on every deployed DigitalOcean resource.
  • You can combine other events, such as pull_request.merged, create.tag, and pull_request.closed, to trigger your CI/CD Pipeline. For example, when you create a tag on your repository, your CI/CD Pipeline will trigger and deploy your resources from DigitalOcean.
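For example, the service's events list in ops.yml could be extended to cover more triggers; the create.tag and pull_request.closed lines below are additions to the sample configuration:

```yaml
events:
  - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.opened"
  - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.merged"
  - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:create.tag"
  - "github:workflows-sh/sample-expressjs-do-k8s-cdktf:pull_request.closed"
```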

Conclusion

You've just navigated through creating a CI/CD pipeline using CTO.ai and Managed Kubernetes on DigitalOcean! This integration is just a starting point. Both CTO.ai and Kubernetes offer many more features for complex workflows and deployment strategies. Keep exploring and improving your pipeline for optimal reliability and efficiency. Happy coding!