Multi-region deployment is the new frontier for businesses aiming to achieve global reach while ensuring low latency and high availability. Google Cloud Platform (GCP) offers the infrastructure to facilitate such deployments, and when coupled with an automated workflow stack, the process becomes seamless.

This article will guide you through deploying multi-region applications on GCP using the GCP GKE Pulumi Py workflow stack.


Set Up Google Cloud Service Account

Create a service account on GCP to grant the workflow access to your project:

  • Go to the GCP console > IAM & Admin > Service accounts.
  • Click Create Service Account.
  • Grant it the relevant permissions for Compute Engine.

Configure and Set up GCP Workflows

Before getting started with this guide, install the GCP GKE Pulumi Py workflow from the workflows-sh GitHub organization. If you don't have access to the repository, kindly contact us at [email protected]. The repo includes complete IaC for deploying infrastructure on GCP: Kubernetes, Container Registry, database clusters, load balancers, and project resource management, all built with Python and Pulumi.

Clone the repository with:

git clone <repository-url>

cd gcp-gke-pulumi-py

Run and Set up your Infrastructure

Next, you need to build and set up the infrastructure that will deploy each resource to GCP using the GCP workflow stack. Run ops run -b . to provision your stack and set up your infrastructure.

  • Select setup infrastructure over GCP
  • This process will build your Docker image and start provisioning your GCP infra resources.


  • Next, select the services you want to deploy from the CLI. We will select the “all” service and install all the dependencies, which will also provision our GCP container registry.
  • Back in the GCP console, open your Container Registry; you will see the registry the workflow created, ready for use.
  • When your resources are deployed and your infra is created, you can view your VM instances, database, GKE cluster, and the other resources you will use in your GCP console.
  • Click on the GKE cluster to see its machine configuration and network settings.
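If you prefer the command line to the console for this check, a small helper can list the provisioned VMs and their machine types by parsing gcloud's JSON output. This is an illustrative sketch, not part of the workflow stack; it assumes the gcloud SDK is installed and authenticated.

```python
import json
import subprocess

def parse_instances(raw_json: str) -> list:
    """Extract (name, machine type) pairs from the JSON emitted by
    `gcloud compute instances list --format=json`."""
    instances = json.loads(raw_json)
    # machineType is a full resource URL; keep only the trailing type name.
    return [(i["name"], i["machineType"].rsplit("/", 1)[-1]) for i in instances]

def list_instances() -> list:
    """Shell out to gcloud and parse its JSON listing (requires gcloud)."""
    result = subprocess.run(
        ["gcloud", "compute", "instances", "list", "--format=json"],
        capture_output=True, text=True, check=True,
    )
    return parse_instances(result.stdout)
```

Calling list_instances() after provisioning should show the same machines you see under Compute Engine in the console.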

Setting Up Your Environment

Configure GCP Service Account

To grant the workflow access to GCP:

  • Go to the GCP Console > IAM & Admin > Service Accounts.
  • Create a service account, and grant it the necessary permissions for App Engine and Compute Engine.
  • Download the JSON key for this service account; the credentials will be stored in Secrets and referenced in your ops.yml file.
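As a quick sanity check before wiring the key into your secrets, you can read the downloaded JSON file and confirm it contains the fields a service-account key always carries. This is a stdlib-only sketch; the file name is illustrative.

```python
import json

def read_service_account_key(path: str) -> dict:
    """Load a GCP service-account JSON key and return its identifying fields."""
    with open(path) as fh:
        key = json.load(fh)
    # Every GCP service-account key includes these fields.
    return {
        "type": key["type"],                 # should be "service_account"
        "project_id": key["project_id"],
        "client_email": key["client_email"],
    }
```

For example, read_service_account_key("gcloud-service-key.json") matches the file name the pipeline writes later when it authenticates.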

Configuring ops.yml for Multi-region Deployment

Your ops.yml should have the necessary configurations for multi-region deployment:

version: "1"
pipelines:
  - name: sample-expressjs-pipeline-gcp-gke-pulumi-py:0.1.1
    description: Build and Publish an image in a GCP Container Registry
    env:
      static:
        - DEBIAN_FRONTEND=noninteractive
        - STACK_TYPE=gcp-gke-pulumi-py
        - ORG=cto-ai
        - GH_ORG=workflows-sh
        - REPO=sample-expressjs-gcp-gke-pulumi-py
        - BIN_LOCATION=/tmp/tools
      secrets:
        - GITHUB_TOKEN
        - PULUMI_TOKEN
    events:
      - "github:workflows-sh/sample-gcp-gke-pulumi-py:pull_request.opened"
      - "github:workflows-sh/sample-gcp-gke-pulumi-py:pull_request.synchronize"
      - "github:workflows-sh/sample-gcp-gke-pulumi-py:pull_request.merged"
    jobs:
      - name: sample-expressjs-build-gcp-gke-pulumi-py
        description: Build step for sample-expressjs-gcp-gke-pulumi-py
        packages:
          - git
          - unzip
          - wget
          - tar
          - gcloud
          - kubectl
        steps:
          - mkdir -p $BIN_LOCATION
          - export PATH=$PATH:$BIN_LOCATION
          - ls -asl $BIN_LOCATION

    - deploy-region:
        region: us-central
        branches:
          only: main
    - deploy-region:
        region: europe-west
        branches:
          only: main
    - deploy-region:
        region: asia-east
        branches:
          only: main

  - name: update load balancer
    description: set up load balancer and utils
    steps:
      - echo $GCLOUD_SERVICE_KEY > ${HOME}/gcloud-service-key.json
      - gcloud auth activate-service-account --key-file=${HOME}/gcloud-service-key.json
      - gcloud container clusters get-credentials YOUR_CLUSTER_NAME
  - name: Deploy to different regions
    steps:
      - gcloud app deploy app.yaml --region=<< parameters.region >>

This configuration specifies a workflow that deploys an application to three different regions (us-central, europe-west, and asia-east) whenever changes are pushed to the main branch.
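The per-region deploy step can be sketched in Python as well. This hypothetical helper simply assembles the same gcloud invocation the pipeline runs, once per region; the region names mirror the configuration above, and the --region flag is taken verbatim from that configuration.

```python
import subprocess

# Regions from the deploy-region entries in ops.yml above.
REGIONS = ["us-central", "europe-west", "asia-east"]

def build_deploy_cmd(region: str) -> list:
    """Assemble the gcloud command the pipeline runs for one region."""
    return ["gcloud", "app", "deploy", "app.yaml", f"--region={region}", "--quiet"]

def deploy_all() -> None:
    """Deploy to every configured region (requires an authenticated gcloud)."""
    for region in REGIONS:
        subprocess.run(build_deploy_cmd(region), check=True)
```

Running deploy_all() from a CI step reproduces the three per-region deploy jobs sequentially.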

You can also update the stack's Python entry-point file in the GCP workflow stack to support multi-region deployments.

import os
from pulumi import export
from src.stack.main import Stack

STACK_ENV = os.getenv("STACK_ENV")
STACK = os.getenv("STACK")
STACK_TYPE = os.getenv("STACK_TYPE")
PULUMI_ORG = os.getenv("PULUMI_ORG")
CLOUD_SQL_REGION = os.getenv("CLOUD_SQL_REGION", "us-central1")
GCS_BUCKET_LOCATION = os.getenv("GCS_BUCKET_LOCATION", "us-central1")
CA_LOCATION = os.getenv("CA_LOCATION", "us-central1")

def main():
    stack = Stack(
        # ... [other parameters]
    )
    export("stack_outputs", stack.outputs)

if __name__ == "__main__":
    main()
With this, you can deploy in different regions by changing the environment variables needed for your resources.
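One way to script that is a thin wrapper that overrides the region variables the entry point reads before invoking the stack. The region keys and values below are illustrative, not from the repo; the pulumi invocation assumes the Pulumi CLI is installed and logged in.

```python
import os
import subprocess

# Illustrative per-region overrides for the environment variables read above.
REGION_OVERRIDES = {
    "us": {"CLOUD_SQL_REGION": "us-central1",
           "GCS_BUCKET_LOCATION": "us-central1",
           "CA_LOCATION": "us-central1"},
    "eu": {"CLOUD_SQL_REGION": "europe-west1",
           "GCS_BUCKET_LOCATION": "europe-west1",
           "CA_LOCATION": "europe-west1"},
}

def region_env(region: str) -> dict:
    """Return the current environment plus the overrides for one region."""
    env = dict(os.environ)
    env.update(REGION_OVERRIDES[region])
    return env

def provision(region: str) -> None:
    """Run `pulumi up` with that region's environment (requires pulumi)."""
    subprocess.run(["pulumi", "up", "--yes"], env=region_env(region), check=True)
```

Calling provision("eu") then provisions the same stack with the European locations, without touching the Python entry point itself.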


When you push code to the main branch of your repository, the workflow:

  • Authenticates with GCP
  • Deploys the application to each region specified in your application workflow, ensuring users from various geographic locations experience reduced latency.

Benefits of Multi-region Deployment

  • Reduced Latency: Serving applications closer to the user base ensures faster load times.
  • High Availability: Even if one region faces an outage, your application remains accessible from other regions.
  • Streamlined Process: The workflow simplifies multi-region deployment, making it consistent and less error-prone.

Wrapping Up: Multi-region Deployments on GCP

Deploying applications across multiple regions is an imperative step towards ensuring high availability, low latency, and disaster recovery. With GCP and the workflow stack described here, this process becomes not just efficient but also easily replicable. If you're looking to achieve global reach without the complexities traditionally associated with such deployments, this combination offers a streamlined path.