How to Configure CloudWatch Logs for AWS ECR using CTO.ai Workflows
AWS CloudWatch provides robust monitoring that lets you track and analyze logs efficiently, and CTO.ai workflows make it straightforward to feed your pipeline activity into it. In this post, we'll walk step by step through configuring CloudWatch Logs for AWS ECR using CTO.ai workflows.
Prerequisites
- A functioning AWS account with appropriate permissions and IAM roles for AWS ECR.
- An AWS ECR repository.
- A CTO.ai account and a project set up with repository access and the AWS workflows installed.
Set Up AWS Identity and Access Management (IAM) Roles and Policies
Before we begin, we need to ensure that the IAM role associated with your CTO.ai workflow has the necessary permissions to send logs to CloudWatch.
- Create a policy with permissions to write logs to CloudWatch and save it as CloudWatchLogsPolicy.json:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*",
      "Effect": "Allow"
    }
  ]
}
- Attach the policy to the IAM role used by your application stack, for example via the IAM console or the AWS CLI as sketched below.
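If you prefer the command line, here is a minimal sketch of creating and attaching that policy with the AWS CLI. The role name cto-ai-workflow-role and the <YOUR_AWS_ACCOUNT_ID> placeholder are assumptions; substitute the IAM role your CTO.ai workflow actually uses and your own account ID.
# Sketch only: create the managed policy from CloudWatchLogsPolicy.json
aws iam create-policy \
  --policy-name CloudWatchLogsPolicy \
  --policy-document file://CloudWatchLogsPolicy.json
# Attach it to the workflow's IAM role (the role name is an assumed placeholder)
aws iam attach-role-policy \
  --role-name cto-ai-workflow-role \
  --policy-arn arn:aws:iam::<YOUR_AWS_ACCOUNT_ID>:policy/CloudWatchLogsPolicy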
Set Up and Configure Your CTO.ai ops.yml File
Next, we'll create and configure the ops.yml file for our ECR project. This file lives in the directory where your ECR project is. In your CTO.ai project settings, add your AWS credentials (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) as environment variables.
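If you still need to generate those credentials, here is a minimal sketch using the AWS CLI. The dedicated IAM user name cto-ai-workflows is an assumption; use whichever user your team has provisioned for CI.
# Sketch only: create an access key for a dedicated IAM user
# (the user name "cto-ai-workflows" is an assumed placeholder).
aws iam create-access-key --user-name cto-ai-workflows
# From the output, copy AccessKeyId and SecretAccessKey into your CTO.ai
# project settings as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.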
Create a CTO.ai Workflow to Push Images to ECR
Here, we set up a CTO.ai pipeline that triggers on pull request activity in your repository (opened, synchronized, or merged) and pushes logs to AWS CloudWatch. To integrate this setup with AWS ECR, the workflow also includes steps that build and push Docker images to your ECR repository.
version: "1"
pipelines:
  - name: sample-expressjs-pipeline-aws-ecs-fargate:0.1.1
    description: build a release for deployment
    env:
      static:
        - DEBIAN_FRONTEND=noninteractive
        - ORG=workflows-sh
        - REPO=sample-expressjs-aws-ecs-fargate
        - AWS_REGION=us-west-1
        - STACK_TYPE=aws-ecs-fargate
      secrets:
        - GITHUB_TOKEN
        - AWS_ACCESS_KEY_ID
        - AWS_SECRET_ACCESS_KEY
        - AWS_ACCOUNT_NUMBER
    events:
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.merged"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.opened"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.synchronize"
    jobs:
      - name: sample-expressjs-build-job-aws-ecs-fargate
        description: sample-expressjs build step
        packages:
          - git
          - unzip
          - python
        steps:
          - curl https://s3.amazonaws.com/aws-cli/awscli-bundle-1.18.200.zip -o awscli-bundle.zip
          - unzip awscli-bundle.zip && ./awscli-bundle/install -b ~/bin/aws
          - export PATH=~/bin:$PATH
          - aws --version
          - git clone https://$GITHUB_TOKEN:x-oauth-basic@github.com/$ORG/$REPO
          - cd $REPO && ls -asl
          - git fetch && git checkout $REF
          - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
          - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
          - aws configure set region $AWS_REGION
          - DATE=$(date -u +%Y-%m-%dT%H:%M:%SZ)
          - LOG_STREAM_NAME="CTO.ai-Logs-$DATE"
          - aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO
          - docker build -f Dockerfile -t $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO-$STACK_TYPE:$REF .
          - docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.$AWS_REGION.amazonaws.com/$REPO-$STACK_TYPE:$REF
services:
  - name: sample-expressjs-service-aws-ecs-fargate:0.1.1
    description: A sample expressjs service
    run: node /ops/index.js
    port: [ '8080:8080' ]
    sdk: off
    domain: ""
    env:
      static:
        - PORT=8080
    events:
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.merged"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.opened"
      - "github:workflows-sh/sample-expressjs-aws-ecs-fargate:pull_request.synchronize"
    trigger:
      - build
      - publish
      - start
Be sure to replace the sample values (the workflows-sh organization, the sample-expressjs-aws-ecs-fargate repository, and the pipeline, job, and service names) with your own organization, repository, and application names, and set AWS_REGION to match your environment.
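Once the pipeline has run, you can double-check from the CLI that the image actually landed in ECR before looking at the logs. A minimal sketch is below; <your-repo-name> is a placeholder for the repository name used in the pipeline (for example, the value of $REPO-$STACK_TYPE in the ops.yml above).
# Sketch only: list the images pushed to your ECR repository
# (replace <your-repo-name> with your actual repository name).
aws ecr describe-images --repository-name <your-repo-name> --region us-west-1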
Back in your AWS CloudWatch console, you can select your ECR data and start viewing the metrics and logs coming from your ECR image.
Next, add the following steps to your pipeline to create the log group and log stream and push a log event to CloudWatch:
- aws logs create-log-group --log-group-name "CTO.ai-Logs"
- aws logs create-log-stream --log-group-name "CTO.ai-Logs" --log-stream-name "$LOG_STREAM_NAME"
- aws logs put-log-events --log-group-name "CTO.ai-Logs" --log-stream-name "$LOG_STREAM_NAME" --log-events timestamp=$(date +%s)000,message="Sample Log Message"
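To confirm the log group, stream, and event were created, you can query CloudWatch Logs from any shell where the AWS CLI is configured. Here is a minimal sketch, assuming the CTO.ai-Logs names used above:
# Sketch only: verify the log stream exists and read back recent events
aws logs describe-log-streams --log-group-name "CTO.ai-Logs"
aws logs get-log-events \
  --log-group-name "CTO.ai-Logs" \
  --log-stream-name "$LOG_STREAM_NAME" \
  --limit 10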
Conclusion
You've now successfully set up CloudWatch logging for AWS ECR using CTO.ai workflows. Now, each time your CTO.ai workflow runs, it will push logs to AWS CloudWatch, helping you monitor and maintain a seamless CI/CD pipeline with detailed logging for your AWS ECR.
Ready to unlock the power of CTO.ai for your team? Schedule your consultation now with one of our experts today!