Hi there! I’m Maneshwar. Right now, I’m building LiveAPI, a first-of-its-kind tool that helps you automatically index API endpoints across all your repositories. LiveAPI makes it easier to discover, understand, and interact with APIs in large infrastructures.
Migrating a virtual machine between cloud providers is a powerful way to gain flexibility and avoid vendor lock-in.
If you’ve been running a service on an AWS EC2 instance and are ready to move it to Google Cloud Platform (GCP), this post will walk you through how to export the entire EC2 instance (OS, files, configs, apps) and recreate it on GCP as a Compute Engine VM.
Why Migrate This Way?
Typical migrations involve recreating environments and manually syncing files, which is time-consuming and error-prone. This approach lets you:
- Preserve full OS and disk state
- Avoid surprises with missing configs or packages
- Port legacy apps without rebuilding
Prerequisites
- An AWS EC2 instance running Linux
- Access to the AWS CLI & GCP SDK (a quick sanity check follows this list)
- A GCP project and bucket ready for the import
- An S3 bucket in AWS for temporary storage
- IAM permissions on both sides
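Before starting, it’s worth confirming both CLIs are authenticated, for example:
aws sts get-caller-identity        # prints the AWS account/identity in use
gcloud auth list                   # shows the active GCP account
gcloud config get-value project    # confirms the target GCP project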
Step 1: Create an AMI of Your EC2 Instance
You can’t export a live EC2 instance directly; you first create an Amazon Machine Image (AMI). The --no-reboot flag below keeps the instance running during the snapshot, at the cost of a small risk of filesystem inconsistency.
aws ec2 create-image \
  --region us-east-1 \
  --instance-id i-xxxxxxxxxxxxxxxxx \
  --name "my-ec2-export" \
  --no-reboot
Save the ImageId returned. You’ll need it in the next step.
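AMI creation takes a few minutes. You can block until it’s ready with the AWS CLI’s built-in waiter (the AMI ID below is a placeholder):
aws ec2 wait image-available \
  --region us-east-1 \
  --image-ids ami-xxxxxxxxxxxxxxx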
Step 2: Export the AMI to an S3 Bucket
Before exporting, you need a service role named vmimport with specific trust and permission policies. Here’s how to set that up.
1. Create a trust policy file (trust-policy.json):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "vmie.amazonaws.com" },
      "Action": "sts:AssumeRole",
      "Condition": { "StringEquals": { "sts:ExternalId": "vmimport" } }
    }
  ]
}
2. Create the role:
aws iam create-role \
  --role-name vmimport \
  --assume-role-policy-document file://trust-policy.json
3. Attach permissions (role-policy.json):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation", "s3:GetObject", "s3:PutObject", "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::your-s3-bucket",
        "arn:aws:s3:::your-s3-bucket/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "ec2:ExportImage", "ec2:Describe*"
      ],
      "Resource": "*"
    }
  ]
}
aws iam put-role-policy \
  --role-name vmimport \
  --policy-name vmimport \
  --policy-document file://role-policy.json
Export the AMI as a VHD
aws ec2 export-image \
  --region us-east-1 \
  --image-id ami-xxxxxxxxxxxxxxx \
  --disk-image-format VHD \
  --s3-export-location S3Bucket=your-s3-bucket,S3Prefix=ec2-export/ \
  --role-name vmimport
Use describe-export-image-tasks to monitor progress:
aws ec2 describe-export-image-tasks --region us-east-1
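The export can take a while for large disks. If you captured the ExportImageTaskId returned by export-image, you can narrow the output to just the fields that matter (one possible --query expression):
aws ec2 describe-export-image-tasks \
  --region us-east-1 \
  --export-image-task-ids export-ami-xxxxxxxxxxxxxxx \
  --query 'ExportImageTasks[0].[Status,StatusMessage,Progress]'
When the status reaches completed, the VHD lands in your bucket under the prefix, typically named after the export task ID.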
Step 3: Transfer to GCP and Import
1. Download the .vhd from S3 and upload it to GCS:
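First pull the exported disk down with the AWS CLI. The object key below assumes the default naming (export task ID plus .vhd); adjust it to match what describe-export-image-tasks reported:
aws s3 cp s3://your-s3-bucket/ec2-export/export-ami-xxxxxxxxxxxxxxx.vhd local-image.vhd
Then push it up to your GCS bucket: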
gsutil cp local-image.vhd gs://your-gcp-bucket/ec2-image.vhd
Alternatively, skip the local round-trip: Google’s Storage Transfer Service can copy the object server-side, either straight from S3 with AWS credentials or from a list of signed S3 URLs (sketch below).
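A minimal sketch of the credentialed route, assuming the gcloud transfer commands are available in your SDK and creds.json (a hypothetical name) holds an AWS access key with read access to the bucket:
gcloud transfer jobs create \
  s3://your-s3-bucket/ec2-export/ gs://your-gcp-bucket/ \
  --source-creds-file=creds.json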
2. Import the image to GCP (set --os to match your source distro; debian-10 here is just an example):
gcloud compute images import my-ec2-image \
  --source-file=gs://your-gcp-bucket/ec2-image.vhd \
  --os=debian-10 \
  --timeout=60m
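Once the import finishes, confirm the image is registered and ready; this should print READY:
gcloud compute images describe my-ec2-image --format='value(status)'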
Step 4: Launch a VM from the Imported Image
gcloud compute instances create my-ec2-on-gcp \
  --image=my-ec2-image \
  --image-project=your-gcp-project \
  --zone=us-central1-a \
  --machine-type=e2-micro
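Then verify you can reach the VM (assuming your firewall rules allow SSH; gcloud will provision a key for you):
gcloud compute ssh my-ec2-on-gcp --zone=us-central1-a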
Done!
You now have your original AWS EC2 instance running in Google Cloud — same OS, same configuration, same files. This method is particularly useful when dealing with:
- Legacy systems
- Deeply customized Linux environments
- Self-hosted apps not easily containerized
Tips
- Always test the image in a staging VM before routing production traffic.
- For Windows instances, the process is similar but requires additional licensing checks.
- Consider switching to GCP’s OS Login and IAM-managed SSH for added security (one-liner below).
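For that last tip, a minimal sketch that enables OS Login on the new instance via instance metadata (project-wide metadata works too):
gcloud compute instances add-metadata my-ec2-on-gcp \
  --zone=us-central1-a \
  --metadata=enable-oslogin=TRUE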
LiveAPI helps you get all your backend APIs documented in a few minutes.
With LiveAPI, you can generate interactive API docs that allow users to search and execute endpoints directly from the browser.
If you’re tired of updating Swagger manually or syncing Postman collections, give it a shot.