How to Use CTO.ai Jobs for Simplified Google Cloud Platform Builds
CTO.ai is a widely used Continuous Integration and Continuous Deployment (CI/CD) platform. Google Cloud Platform (GCP), on the other hand, is a suite of cloud services that includes hosting, computing, data storage, and machine learning capabilities. Integrating CTO.ai with GCP helps developers streamline their build and deployment pipelines. In this article, we will explore how to use CTO.ai jobs for simplified builds on GCP.
Prerequisites
- An active CTO.ai account.
- A GCP project with admin permissions.
- The CTO.ai GCP workflow installed in your application.
Step-by-step Guide:
Setting up CTO.ai
Before you get started, install and set up the CTO.ai GCP workflow from the GitHub repository. If you don't have access to it, reach out to our support team (support@cto.ai) to get onboarded. Once you're in, deploy the complete infrastructure using the `ops run -b .` command.
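Assuming you have the ops CLI installed and the workflow repository cloned locally, the setup looks roughly like this (the repository URL is illustrative):

```bash
# Clone the CTO.ai GCP workflow repository (URL is illustrative)
git clone https://github.com/workflows-sh/sample-expressjs-gcp-gke-pulumi-py.git
cd sample-expressjs-gcp-gke-pulumi-py

# Build from the current directory and run the workflow to deploy the infrastructure
ops run -b .
```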
Authenticating with GCP
Create a service account in GCP and download its JSON key. Then, back in your application, create a file named `ops.yml`. The `ops.yml` file is where you configure and write your jobs and builds for your workloads. Once your jobs are defined, you can deploy your `ops.yml` file, which will trigger your builds on GCP.
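The service account and key can be created from the command line with the gcloud CLI; the account name and roles below are illustrative and should match your project's needs:

```bash
# Target your GCP project (placeholder ID)
gcloud config set project YOUR_PROJECT

# Create a service account for the pipeline (name is illustrative)
gcloud iam service-accounts create cto-ai-builder \
  --display-name="CTO.ai build service account"

# Grant the roles your jobs need, e.g. pushing images and deploying to GKE
gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:cto-ai-builder@YOUR_PROJECT.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:cto-ai-builder@YOUR_PROJECT.iam.gserviceaccount.com" \
  --role="roles/container.developer"

# Download the JSON key referenced by the pipeline's GCLOUD_SERVICE_KEY secret
gcloud iam service-accounts keys create key.json \
  --iam-account=cto-ai-builder@YOUR_PROJECT.iam.gserviceaccount.com
```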
Configuring CTO.ai Jobs for GCP Builds
CTO.ai jobs let you define each build step as a container. You can also combine multiple jobs using workflows; for instance, you can set up a workflow that triggers your GCP build when a specified event occurs, such as a pull request being opened. Below is an example configuration that uses CTO.ai to build, test, and deploy an application on GCP.
```yaml
version: "1"
pipelines:
  - name: sample-expressjs-pipeline-gcp-gke-pulumi-py:0.1.1
    description: Build and publish an image to a GCP Container Registry
    env:
      static:
        - DEBIAN_FRONTEND=noninteractive
        - STACK_TYPE=gcp-gke-pulumi-py
        - ORG=cto-ai
        - GH_ORG=workflows-sh
        - REPO=sample-expressjs-gcp-gke-pulumi-py
        - BIN_LOCATION=/tmp/tools
      secrets:
        - GITHUB_TOKEN
        - PULUMI_TOKEN
        - GCLOUD_SERVICE_KEY # service account JSON key, used by the push and deploy jobs below
    events:
      - "github:workflows-sh/sample-gcp-gke-pulumi-py:pull_request.opened"
      - "github:workflows-sh/sample-gcp-gke-pulumi-py:pull_request.synchronize"
      - "github:workflows-sh/sample-gcp-gke-pulumi-py:pull_request.merged"
    jobs:
      - name: sample-expressjs-build-gcp-gke-pulumi-py
        description: Build step for sample-expressjs-gcp-gke-pulumi-py
        packages:
          - git
          - unzip
          - wget
          - tar
        steps:
          - mkdir -p $BIN_LOCATION
          - export PATH=$PATH:$BIN_LOCATION
          - ls -asl $BIN_LOCATION
      - name: sample-expressjs-lint
        description: Linting step for code quality
        packages:
          - npm
        steps:
          - npm install
          - npm run lint
      - name: sample-expressjs-security-scan
        description: Security scanning step for vulnerabilities
        steps:
          - echo "Run your security tools here"
      - name: sample-expressjs-test
        description: Test step for sample-expressjs-gcp-gke-pulumi-py
        packages:
          - npm
        steps:
          - npm install
          - npm test
      - name: sample-expressjs-dockerize
        description: Dockerize the application
        packages:
          - docker
        steps:
          # Tag the image with the registry path so it matches the push step;
          # gcr.io paths take the form gcr.io/<gcp-project-id>/<image>
          - docker build -t gcr.io/$GH_ORG/$REPO:latest .
      - name: sample-expressjs-push-image
        description: Push Docker image to GCP's Container Registry
        packages:
          - docker
          - gcloud
        steps:
          - echo $GCLOUD_SERVICE_KEY | gcloud auth activate-service-account --key-file=-
          # Register gcloud as a Docker credential helper so docker can push to gcr.io
          - gcloud auth configure-docker --quiet
          - docker push gcr.io/$GH_ORG/$REPO:latest
      - name: sample-expressjs-deploy
        description: Deploy step for sample-expressjs-gcp-gke-pulumi-py
        packages:
          - gcloud
          - kubectl
        steps:
          # Pipe the key from the secret, as in the push job, rather than treating it as a file path
          - echo $GCLOUD_SERVICE_KEY | gcloud auth activate-service-account --key-file=-
          - gcloud container clusters get-credentials YOUR_CLUSTER_NAME --zone YOUR_ZONE --project YOUR_PROJECT
          - kubectl apply -f deployment.yaml
```
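The deploy job's final step applies a `deployment.yaml` manifest that is not shown above; a minimal sketch of what such a manifest might contain (names, image path, and port are illustrative):

```yaml
# deployment.yaml — minimal illustrative manifest; adjust names, image path, and ports
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-expressjs
spec:
  replicas: 2
  selector:
    matchLabels:
      app: sample-expressjs
  template:
    metadata:
      labels:
        app: sample-expressjs
    spec:
      containers:
        - name: sample-expressjs
          image: gcr.io/YOUR_PROJECT/sample-expressjs-gcp-gke-pulumi-py:latest
          ports:
            - containerPort: 3000
```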
Structure
- `version: "1"`: Specifies the version of the CTO.ai pipeline configuration.
- `pipelines`: The top-level key of the `ops.yml` file, under which one or more pipelines are defined.
- Each pipeline has several attributes:
  - `name`: The unique name of the pipeline.
  - `description`: A brief description of what the pipeline does.
  - `env`: Specifies environment variables. These can be static (pre-defined) or secrets (sensitive data).
  - `events`: Specifies the events that should trigger the pipeline. In this case, it is set to trigger on GitHub pull request events.
  - `jobs`: A list of tasks or stages the pipeline executes in order.
Environment variables
These are pre-defined variables that can be used throughout the jobs in the pipeline:
- `static`: Variables that are hard-coded and available to any job.
- `secrets`: Sensitive data that you don't want to expose. These are fetched securely from CTO.ai's secrets storage or another secrets-management tool.
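If you manage secrets with the ops CLI, values like these can be stored before the pipeline runs. The `secrets:set` flag syntax below is an assumption based on common ops CLI usage; check `ops help` for the exact form in your CLI version:

```bash
# Store pipeline secrets in CTO.ai's secrets store
# (flag syntax is assumed; verify with `ops help secrets:set`)
ops secrets:set -k GITHUB_TOKEN -v <your-github-token>
ops secrets:set -k PULUMI_TOKEN -v <your-pulumi-token>

# Store the downloaded service account key as a single value;
# the pipeline pipes it to gcloud at runtime
ops secrets:set -k GCLOUD_SERVICE_KEY -v "$(cat key.json)"
```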
Jobs:
Jobs are the individual tasks or stages of the pipeline. Each job has:
- `name`: A unique name for the job.
- `description`: A brief explanation of the job's purpose.
- `packages`: Software or tools needed for the job.
- `steps`: Sequential commands that are executed in the job.
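Putting those four attributes together, a minimal job might look like this (the name and commands are placeholders):

```yaml
jobs:
  - name: my-example-job                # unique name for the job (placeholder)
    description: Print tool versions    # what the job is for
    packages:                           # tools installed into the job's container
      - npm
    steps:                              # commands executed in order
      - node --version
      - npm --version
```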
Job Breakdown:
- `sample-expressjs-build-gcp-gke-pulumi-py`: Sets up the tools directory used by later steps. It uses packages like git, unzip, wget, and tar.
- `sample-expressjs-lint`: A linting job that checks code quality. Assumes there is a lint script in the project's package.json.
- `sample-expressjs-security-scan`: A placeholder job for security scanning. Replace the echo step with your actual security tools or scripts.
- `sample-expressjs-test`: Tests the application. Assumes the application has tests that can be run via npm test.
- `sample-expressjs-dockerize`: Builds a Docker image of the application. Assumes there is a Dockerfile at the root of your project.
- `sample-expressjs-push-image`: Pushes the built Docker image to GCP's Container Registry.
- `sample-expressjs-deploy`: Deploys the application to a Google Kubernetes Engine (GKE) cluster in Google Cloud Platform (GCP).
When you are done configuring your jobs in the `ops.yml` file, push the changes to your GitHub repo, then open or merge a pull request; your pipelines will be triggered.
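For example, from the root of your application repository (the branch name is a placeholder):

```bash
# Commit the pipeline configuration and push it to GitHub
git add ops.yml
git commit -m "Add CTO.ai pipeline for GCP builds"
git push origin main

# Opening or merging a pull request now matches the pull_request events
# declared in ops.yml and triggers the pipeline
```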
Conclusion
CTO.ai provides a robust platform for implementing CI/CD pipelines. By integrating CTO.ai with GCP, developers can automate and simplify their build and deployment processes, ensuring that software is delivered to users reliably and efficiently. With the steps in this article, you can set up a seamless integration between CTO.ai and GCP.