Workflows Overview
Using CTO.ai Commands, Pipelines, and Services, you can build containerized workflows which define how your team can manage your cloud application infrastructure interactively.
From simple commands that wrap the CLI of another tool, to complex multi-step, multi-cloud development processes, our different types of workflows can be used to build a cloud application platform that meets your needs. Commands can be integrated with each other and can be used to manage Pipelines and Services workflow runs, giving you significant flexibility in how you translate your DevOps playbooks to run on our platform.
We currently offer Workflow SDKs with support for four different runtimes:
- Python
- Node
- Bash
- Go
Our SDKs provide a straightforward way to run automated processes, send Lifecycle Event data to your CTO.ai Dashboard, prompt users interactively (when using Commands), and integrate with other systems.
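For a sense of what this looks like in practice, here is a minimal sketch of a Commands workflow entrypoint in Python. The `cto_ai` module layout and call signatures shown are assumptions for illustration, not a verbatim API reference; consult the SDK documentation for the exact functions:

```python
# Illustrative sketch only: the cto_ai module layout and function
# signatures below are assumptions, not a verbatim SDK reference.
from cto_ai import prompt, sdk, ux

def main():
    # Prompt the user interactively (available in Commands workflows).
    name = prompt.input(name="name", message="What is your name?")
    ux.print(f"Hello, {name}!")

    # Send a Lifecycle Event to the CTO.ai Dashboard.
    sdk.track(tags=["demo"], event_name="demo-run", payload={"user": name})

if __name__ == "__main__":
    main()
```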
Anatomy of a Workflow
There are three main components to workflows of every kind:
- An `ops.yml` configuration file
- A `Dockerfile` (and a corresponding `.dockerignore` file)
- Your custom code
Create a Workflow from a Template (ops init)
The CTO.ai `ops` CLI provides a subcommand to create scaffolding files for your workflow: `ops init`.
If you run `ops init` without any arguments, you will be interactively prompted for a name, the type of workflow you wish to create, a description, and a version number. This information is used to generate scaffolding code (containing a basic template for your workflow) in a subdirectory of your current path:
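For example, initializing a Python-based Commands workflow named `my-command` would produce a directory along these lines (an illustrative sketch based on the files described below):

```
my-command/
├── ops.yml
├── Dockerfile
├── .dockerignore
├── main.py
└── requirements.txt
```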
Workflow Scaffolding
As you can see from the example above, when you create a workflow using `ops init` (in this case, a Python-based Commands workflow), it also adds the basic files you might expect when building an application in the selected language. Here is how all of these files work together:
- The `main.py` file contains minimal demo code which prompts the user for input and sends a Lifecycle Event to the CTO.ai platform, both using our SDK.
- The included `Dockerfile` installs your workflow's dependencies using the `requirements.txt` file; in general, the `Dockerfile` is where the runtime environment for your workflow is defined.
- The `ops.yml` file is used to configure how the workflow is actually run by our CLI and cloud platform; each workflow defined in an `ops.yml` file will specify the container that should be used and the code that should be run within it (see the sketch after this list).
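To make the `ops.yml` piece concrete, here is a minimal sketch of what a Commands entry might look like; the field values are illustrative, and the exact schema is covered in the ops.yml reference documentation:

```yaml
version: "1"
commands:
  - name: my-command:0.1.0           # workflow name and version
    description: An example Commands workflow
    run: python3 /ops/main.py        # the code to run inside the container
```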
Each of our templates includes the basic scaffolding you need to start building a workflow in the language of your choice. Every template (Bash, Python, JavaScript, or Go) includes the common scaffold files `ops.yml`, `Dockerfile`, and `.dockerignore`, along with language-specific source and dependency files (such as `main.py` and `requirements.txt` in the Python template).
Scaffolding for Pipelines and Jobs
Please note that when you create a Pipelines workflow, only an `ops.yml` file will be created. After you have defined one or more Pipelines Jobs in any `ops.yml` file (regardless of the type of template originally used), you can run `ops init . -j` in the directory containing your `ops.yml` file to generate scaffolding code for each Job. The scaffolding code for each Job is created in a subdirectory of `./.ops/jobs/`, where the name of each subdirectory matches the name of a Job in your `ops.yml` file.
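As a sketch, an `ops.yml` defining two Pipelines Jobs might look like the following (field names follow common ops.yml examples; check the reference documentation for the exact schema):

```yaml
version: "1"
pipelines:
  - name: sample-pipeline:0.1.0
    description: An example Pipelines workflow
    jobs:
      - name: build-job
        description: Build the application
        steps:
          - echo "Building..."
      - name: deploy-job
        description: Deploy the application
        steps:
          - echo "Deploying..."
```

Running `ops init . -j` next to this file would then generate scaffolding in `./.ops/jobs/build-job/` and `./.ops/jobs/deploy-job/`.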