What is Elastic Container Service?

ECS is a managed service for running Docker containers. It sits on top of Docker, letting you launch, configure, and monitor your containers across an ECS cluster.

In this tutorial, we are going to learn how to Dockerize an application and deploy it to AWS Elastic Container Service.


We are going to Dockerize an Express API and run it on our local machine before deploying it. Express is a minimal and flexible Node.js web application framework that provides a robust set of features for web and mobile applications.

Express is built on top of Node.js; however, you don't need Node installed on your local machine, because we are working with Docker and the containers are isolated from our local machine.



We have the source code for our Express API in our GitHub repo; you can clone it and restructure the codebase to suit your workflow.


Step 1: Create an index.js file and paste the code below in your code editor.

const express = require("express");

const app = express();

app.get("/", (req, res) => {
  res.send("This is my express app");
});

app.get("/me", (req, res) => {
  res.send("Hi I am Tola");
});

app.listen(5000, () => {
  console.log("Listening on port 5000");
});

  • Whenever you import a module like const express = require("express"), require returns whatever that module exports; for Express, that is a function.
  • That function is called with express(), and app is the object it returns.
  • app.get() is a function that tells the server what to do when a GET request arrives at the given route. Its callback (req, res) receives the incoming request (the req object) and responds using the response object (res). Both req and res are made available to us by the Express framework.
  • We have two endpoints, / and /me. The / endpoint responds with "This is my express app", and the /me endpoint responds with "Hi I am Tola".
  • app.listen(5000) starts the server listening on port 5000; its callback runs once the server is ready.

Step 2: After creating your index.js file, copy and paste the package.json file from the repo. It lists all the dependencies that your application needs.
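If you are not copying from the repo, a minimal package.json for this app might look like the following (the name and version numbers are illustrative):

```json
{
  "name": "express-app",
  "version": "1.0.0",
  "description": "A minimal Express API",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.17.1"
  }
}
```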


Step 3: Let’s go ahead and Dockerize our application by creating a Dockerfile.

To create your Dockerfile, create a new file in your editor and name it Dockerfile (no extension).

When creating your file:

  • Be sure to specify a base image, since we don't want to start from a completely empty image. FROM node:alpine is our base image.
  • The next step is to specify your working directory: the directory inside the container where all of our files will be stored and run. WORKDIR /app makes that an app directory.
  • Next, copy package.json into the working directory using COPY package.json . (the trailing dot means "into the current working directory", which is /app).
  • Now that package.json is in place, install all of your dependencies using RUN npm install, then copy everything in the build context into the app directory using COPY . . (this copies your index.js file as well).
  • Our server listens on port 5000, so declare that with EXPOSE 5000, and set the container's start command with CMD ["node", "index.js"].
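Putting the steps above together, the complete Dockerfile looks like this:

```dockerfile
# Base image with Node.js preinstalled
FROM node:alpine

# All subsequent commands run inside /app
WORKDIR /app

# Copy package.json first so npm install is cached between builds
COPY package.json .

# Install the dependencies
RUN npm install

# Copy the rest of the source code (including index.js)
COPY . .

# Document the port the app listens on
EXPOSE 5000

# Start the server
CMD ["node", "index.js"]
```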

Step 4: Go ahead and build your image.

To build your Docker image, open your terminal or CLI and run docker build -t express-app . (note the trailing dot, which tells Docker to use the current directory as the build context). Your image will now build layer by layer.

  • View your image ID using docker images. Copy the image ID to run your application.

Step 5: Run your application using the docker run command docker run -p 8080:5000 70846e60c430.

Inside the container, the application listens on port 5000, the port we specified earlier; the -p 8080:5000 flag maps it to port 8080 on your machine.

You can see that our application is listening.

  • Visit your application url localhost:8080 in your browser to confirm changes.
  • If you change the endpoint in your address bar (for example, localhost:8080/me), you will get that endpoint's response.

Step 6: Next, let’s get our Docker application running on AWS. In your AWS console, search for ECS (Elastic Container Service).


Step 7: Before we deploy our application to ECS, we need to deploy our image into AWS using Amazon ECR.

This is the container registry where we store all of our images; Amazon ECR is similar to Docker Hub in this respect.

  • Head over to Repository where we are going to create a new repository. Click on Create Repository.
  • Select Public and give it a name like express-app.
  • When you are done click on Create Repository. You can see that our Repository is now created.

Once the repository is created, we can build our image and push it to the express-app repo.


Step 8: Back in your Amazon ECR repository, select the express-app repository we just created and click on View push commands.

The push commands show all of the steps you need to run to push your local Docker image into the repository we just created.

  • Authenticate your Docker client to your registry by copying the first push command and pasting it into your terminal, in the directory containing your application source code.
  • Copy the second push command to build your image.
  • Tag your image so you can push it to the repository, using the tag command docker tag express-app:latest public.ecr.aws/c6k2k2p8/express-app:latest.
  • Push your image to your repository with the final push command.
  • Back in your ECR console, you will see your application image with the latest image tag.
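Taken together, the push commands typically look like the following; the region and the c6k2k2p8 registry alias come from this example repo, so substitute your own:

```shell
# 1. Authenticate your Docker client to the public ECR registry
aws ecr-public get-login-password --region us-east-1 | docker login --username AWS --password-stdin public.ecr.aws

# 2. Build the image
docker build -t express-app .

# 3. Tag it with the repository URI (replace c6k2k2p8 with your registry alias)
docker tag express-app:latest public.ecr.aws/c6k2k2p8/express-app:latest

# 4. Push it
docker push public.ecr.aws/c6k2k2p8/express-app:latest
```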

Step 9: Now that our image is in ECR, head over to ECS to create your cluster.

ECS will set up an EC2 instance and a network for your application. Click on Amazon ECS and select Create Cluster.

  • Choose the EC2 Linux + Networking cluster template.
  • Click on Next Step, and configure your instance with your cluster name, instance type, number of instances, key pair, and volume.
  • Auto-assign public IP should also be enabled.
  • When you are done, click on Create. The process will create your EC2 instance and connect it to the network we just configured.

Step 10: Visit your EC2 console, you will see your EC2 instance running.


Step 11: Back in ECS, you will see your cluster. Click on it and then select Create new task definition.

  • For Step 1, select EC2.
  • Enter the configuration for your task, and make sure Task Memory is 100 (MiB) and Task CPU is 1 vCPU.

Enter the name of your container and your image URI. The image URI is taken from your Amazon ECR repo in step 8.

  • Set the container port to 5000 and the host port to 8080. In the code we wrote earlier, port 5000 was the port internal to our Express application, and 8080 was the host port we mapped it to.
  • When you are done click on Add and create your Task Definition.

Step 12: Back in your Express App Cluster, select Tasks and Click on Run new Task.

  • Specify EC2 as your Launch Type, then click on Run Task.
  • Your task will now start running. Once it is running, we should be able to access it.

Step 13: To access our App, go back to your EC2 dashboard, select the security group of your running EC2 instance and edit the inbound rules.

Add two new inbound rules for port 8080 (one for IPv4 and one for IPv6) with source Anywhere, then save the rules.


Step 14: Go back to your EC2 dashboard, copy your Public IPv4 DNS address and paste it in your browser.

  • You can see that your Express application is now deployed and running, and you can start making HTTP requests to the server.
  • When you are done with your deployment and application, back in your ECS console, you can delete your cluster by selecting Delete Cluster in the dashboard.

ECS runs and manages Docker-enabled applications across a logical group of Amazon EC2 instances. ECS keeps track of your instances, the resources they have, and what they are running. When you make requests to ECS, it manages communication with the Docker daemon on your EC2 instances and deploys containers onto them.

Provisioning a workflow for auto-scaling your AWS ECS container service and creating customized CI/CD for build, test, and release environments using CTO.ai Command-based workflows and Delivery Insights will deliver speed and agility for your business. Our infrastructure will also help you manage and upgrade your orchestration systems effectively without downtime.

If you have enjoyed reading this, be sure to check out our other blog posts.


Stay up to date by joining the CTO.ai Community