How to Set Up a CI/CD Pipeline with Express JS and CTO.ai

CTO.ai Pipelines let you configure and build powerful CI/CD capabilities directly inside your application source code, so you can quickly test your pipeline and implement a build, run, and publish workflow for your application. With our CI/CD pipeline configuration, you can reuse your pipelines across multiple languages and runtimes within the same workflow.

Prerequisites

Here is a quick list of what we’ll accomplish in this post:

  • Configure your ops.yml file and build a CI/CD pipeline for your JavaScript app
  • Write a sample Express JS app

Setting up CTO.ai Account

1. Sign up and log in to your account on CTO.ai.

2. Create a sample Node.js REST API app.

const express = require('express');

const app = express();

app.get('/', (req, res) => res.send('workflow app'));

app.listen(3000, () => {
  console.log('running on port 3000!');
});
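The Dockerfile in the next step installs dependencies with npm install and starts the app with npm start, so the project also needs a package.json with a start script. Here is a minimal sketch; the file name index.js, the package name, and the version numbers are placeholder assumptions, so adjust them to match your own project:

{
  "name": "sample-app-expressjs",
  "version": "1.0.0",
  "description": "Sample Express JS app (placeholder values)",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.17.1"
  }
}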

3. Create a Dockerfile and specify the base image, WORKDIR, and your commands.

# Base image from the CTO.ai official Node.js images
FROM registry.cto.ai/official_images/node:2-12.13.1-stretch-slim

WORKDIR /ops

# Install dependencies first so the npm install layer can be cached
COPY package.json ./
RUN npm install

# Copy the application source into the working directory
COPY . .

EXPOSE 3000

CMD ["npm", "start"]

4. Next, sign in to your account and project using the ops account:signin command.
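From your project directory, the sign-in is a single command; this assumes you already have the Ops CLI installed on your machine:

# Authenticate the Ops CLI with your CTO.ai account
ops account:signin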

5. Initialize a new pipeline for your JavaScript application using the ops init command and select pipeline.

  • The command creates an ops.yml file in the directory, where you can edit, configure, and define the pipeline steps for your application:
pipelines:
 - name: sample-app-expressjs-pipeline:0.1.0
   description:
   env:
     static:
       - DEBIAN_FRONTEND=noninteractive
     secrets:
       - GITHUB_TOKEN
       - GITHUB_USERNAME
       - AWS_ACCESS_KEY
       - AWS_SECRET_KEY
       - AWS_ACCOUNT_NUMBER
   events:
     - "github:<github_org>/<github_repo>:pull_request.merged"
     - "github:<github_org>/<github_repo>:create.tag"
   jobs:
     - name: sample-app-expressjs-pipeline-build-job
       description: example build step
       packages:
         - git
         - unzip
         - python
       steps:
         - curl https://s3.amazonaws.com/aws-cli/awscli-bundle-1.18.200.zip -o awscli-bundle.zip
         - unzip awscli-bundle.zip && ./awscli-bundle/install -b ~/bin/aws
         - export PATH=~/bin:$PATH
         - aws --version
         - git clone https://$GITHUB_TOKEN:x-oauth-basic@github.com/$ORG/$REPO
         - cd $REPO && ls -asl
         - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin $AWS_ACCOUNT_NUMBER.dkr.ecr.us-east-1.amazonaws.com/$REPO
         - docker build -t $AWS_ACCOUNT_NUMBER.dkr.ecr.us-east-1.amazonaws.com/$REPO:$REF .
         - docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.us-east-1.amazonaws.com/$REPO:$REF
  • Before your pipeline can use the secrets defined in your ops.yml file, you need to save them on the CTO.ai dashboard. The secrets this pipeline references are GITHUB_TOKEN, GITHUB_USERNAME, and the AWS credentials (AWS_ACCESS_KEY, AWS_SECRET_KEY, and AWS_ACCOUNT_NUMBER); see the sketch after this list for a CLI alternative.
  • In the events section, you can add the workflow events that trigger the pipeline, such as opening a pull request, creating a tag on the repository, or merging a pull request.
  • Specify your pipeline steps and commands in the steps section under jobs. The jobs in your ops.yml file orchestrate your workflows, run the executable commands you define, and build your entire application. In this ops.yml file, the job downloads the AWS CLI, builds the JavaScript application with Docker, and pushes the Docker image to the registry.
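As an alternative to the dashboard, the Ops CLI also exposes secrets commands. The snippet below is only a sketch and assumes your CLI version supports them; the prompts and behavior may differ, so check the CLI help if they do:

# Store the secrets the pipeline expects (repeat for each of GITHUB_TOKEN,
# GITHUB_USERNAME, AWS_ACCESS_KEY, AWS_SECRET_KEY, and AWS_ACCOUNT_NUMBER)
ops secrets:set

# Confirm the secrets are registered for your team
ops secrets:list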

6. Next, when you’re done configuring the application, build your JavaScript application using the ops build . command. The ops build . command builds your workflow from the Dockerfile and ops.yml in the specified path.
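Run the build from the directory that contains both files; the pipeline name and version come from the name field in your ops.yml:

# Build the pipeline image from the Dockerfile and ops.yml in the current directory
ops build .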

7. After building your application, you can run it using the ops run . command. This command starts a container from the image you just built and runs the commands you defined inside it.
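Running the workflow locally is an easy way to catch failing steps before you publish:

# Start a container from the image built above and run the pipeline's commands locally
ops run .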

8. Next, publish your workflow using the ops publish . command. The ops publish . command shares your workflow by pushing your local Docker image to your registry on CTO.ai.
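Publishing is also run from the project directory; if you’ve already published this name and version, you may need to bump the version in the name field of ops.yml first:

# Push the local image to your CTO.ai registry so GitHub events can trigger the published pipeline
ops publish .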

9. Next, back in your GitHub repo, trigger your pipeline by merging a pull request or creating a tag, matching the events defined in your ops.yml.

10. Back in your CTO.ai dashboard, you’ll see your pipeline start running its jobs.

  • When you click on it, you can see all the details like RUN ID, VERSION, LAST RUN, STATUS, and the pipeline ACTION.

  • Select your RUN ID, and you’ll see all of your pipeline logs. You can expand the information captured from your application and check whether any errors occurred during the release process.

Is there more?

There's always more! There are so many things that you can do with Pipelines on CTO.ai. Check out the CTO.ai docs to learn how to take full advantage of them.

Pipelines are easy to set up and integrate; they speed up your application development and make your code more reliable by creating a standardized and secure software delivery pipeline.