Gemini-powered Amazon S3 to Cloud Storage migration recommendations

This project provides a set of Gemini CLI custom commands to accelerate assessment and migration of workloads and data from Amazon Web Services (AWS) S3 to Google Cloud Storage. The commands use Gemini to analyze data, identify resources suitable for migration, and augment traditional assessments with AI.

Prerequisites

AWS Permissions

To run the inventory extraction script (get_bucket_inventory.sh) on your source bucket, your AWS user or role requires the following read permissions:

  • s3:ListBucket (and s3:ListBucketVersions or s3:ListObjectVersions)
  • s3:GetBucketLocation
  • s3:GetBucket* (e.g., GetBucketPolicy, GetBucketTagging, GetBucketEncryption)
  • s3:GetObject (Required to read object metadata/headers)
  • s3:GetObjectTagging
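
A least-privilege IAM policy granting these read permissions might look like the following sketch (BUCKET_NAME is a placeholder; adjust the resource ARNs to your bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "InventoryRead",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:ListBucketVersions",
        "s3:GetBucketLocation",
        "s3:GetBucket*",
        "s3:GetObject",
        "s3:GetObjectTagging"
      ],
      "Resource": [
        "arn:aws:s3:::BUCKET_NAME",
        "arn:aws:s3:::BUCKET_NAME/*"
      ]
    }
  ]
}
```

Note that the bucket-level actions (ListBucket, GetBucket*) apply to the bucket ARN, while the object-level actions (GetObject, GetObjectTagging) apply to the `/*` object ARN.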

AWS Permissions for Demo Script

This demo requires an S3 bucket with objects set up before you can run the custom Gemini commands. The project includes a shell script, setup-s3-demo.sh, that creates an AWS bucket with sample objects for demo purposes. The following AWS permissions are necessary to run setup-s3-demo.sh.

  • AWS Permissions: An AWS user account with Write permissions to create resources. Least-privilege policy example:
    • s3:CreateBucket
    • s3:PutBucketTagging
    • s3:PutObject
    • s3:ListAllMyBuckets
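
A corresponding least-privilege write policy might look like the sketch below. The `mmb-*` bucket pattern assumes the demo's default bucket naming (see the BUCKET_NAME variable in Option A); adjust it if you customize the name. Note that s3:ListAllMyBuckets must be granted on all resources:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DemoSetupWrite",
      "Effect": "Allow",
      "Action": [
        "s3:CreateBucket",
        "s3:PutBucketTagging",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::mmb-*",
        "arn:aws:s3:::mmb-*/*"
      ]
    },
    {
      "Sid": "ListBuckets",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    }
  ]
}
```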

Google Cloud Permissions

To execute the migration commands generated by Gemini (creating buckets and transfer jobs), ensure your Google Cloud User or Service Account has the Owner role (roles/owner) on the project.

Requirements

Before you begin, ensure the following are installed and configured:

  • Gemini CLI: For installation instructions, visit Gemini CLI Deployment Guide
  • AWS CLI: Install AWS CLI

    1. Verify AWS CLI version:

      aws --version
      
    2. Configure AWS Credentials:

      aws configure
      

      Provide your AWS Access Key ID, Secret Access Key, and default region.

Project Setup

Before running any scripts, you must install the Gemini CLI and clone this repository to access the necessary tools and scripts.

  1. Clone the Repository:

    git clone https://github.com/GoogleCloudPlatform/cloud-solutions.git
    

Prepare Inventory Data

You need an inventory file containing a list of objects to run the analysis. Choose one of the following options to either generate live data from AWS or use a local sample file. Use the table below to determine which option best fits your environment:

Option Use Case Requirements
Option A Full Demo Creation: You want to generate a fresh AWS bucket with test data and scan it. AWS Credentials
Option B Bring Your Own Bucket: You already have an S3 bucket you want to analyze. AWS Credentials & Existing Bucket
Option C Quick Start / No AWS: You do not have AWS access or want to use the provided mock data. None
Option D Bring Your Own File: You already have an S3 inventory CSV file generated. Existing Inventory CSV

Option A

Create & Analyze a New Demo Bucket (Requires AWS Credentials)

Use this option to create a new S3 bucket filled with generated data, and then scan it. This is best for a full end-to-end test.

  1. Navigate to the project directory:

    cd cloud-solutions/projects/migrate-from-aws-to-google-cloud/s3-to-cloud-storage
    
  2. Set Environment Variables: Customize the following variables as needed:

    export BUCKET_NAME="mmb-$(date +%s)"
    export AWS_REGION="us-east-2"
    export PROJECT_TAG="mmb"
    export COST_CENTER_TAG="1234"
    
  3. Run the Setup Script:

    ./scripts/setup-s3-demo.sh
    

    When prompted, enter yes to approve the creation of AWS resources, then continue with the next step to generate the inventory.

  4. Run the Script: Execute the script against the specific AWS bucket you want to analyze (using the variable you exported earlier):

    ./scripts/get_bucket_inventory.sh $BUCKET_NAME
    
  5. Verify Output & Export Variable: The script will create a file named ${BUCKET_NAME}_inventory.csv. Run the following to export this file path for the next step:

    export INVENTORY_FILE="${BUCKET_NAME}_inventory.csv"
    
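
The resulting inventory file is a plain CSV containing per-object metadata. The exact columns depend on the script version; the headers and rows below are illustrative, not guaranteed:

```csv
key,size_bytes,storage_class,last_modified,encryption
logs/2024/01/app.log,10485760,STANDARD,2024-01-15T10:22:03Z,AES256
archive/backup.tar.gz,524288000,GLACIER,2023-06-01T08:00:00Z,aws:kms
```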

Option B

Analyze an Existing Bucket (Requires AWS Credentials)

Use this option if you already have an S3 bucket you want to use for this test.

  1. Identify Your Bucket: Export the name of your existing bucket:

    export BUCKET_NAME="YOUR_AWS_BUCKET_NAME"
    
  2. Run the Script: Execute the script against the specific AWS bucket you want to analyze (using the variable you exported earlier):

    ./scripts/get_bucket_inventory.sh $BUCKET_NAME
    
  3. Verify Output & Export Variable: The script will create a file named ${BUCKET_NAME}_inventory.csv. Run the following to export this file path for the next step:

    export INVENTORY_FILE="${BUCKET_NAME}_inventory.csv"
    

Option C

Use Sample Data (No AWS Environment Required)

Use this option if you do not have an AWS environment. You will use a pre-generated sample inventory file.

  1. Set Environment Variables:

    export INVENTORY_FILE="test-data/aws-s3-migration-analysis/SAMPLE-S3_inventory.csv"
    

Option D

Use Your Own Inventory File

Use this option if you have already generated an inventory CSV file from your own S3 environment and want to use it for this analysis. If you need to generate this file, follow the instructions: Build an inventory of your Amazon S3 buckets

  1. Set Environment Variables:

    export INVENTORY_FILE="PATH_TO_YOUR_INVENTORY_FILE"
    
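
Whichever option you chose, you can sanity-check the inventory file before running the analysis. A minimal sketch (check_inventory is a hypothetical helper, not part of the project scripts; it assumes the first line of the CSV is a header):

```shell
# Hypothetical helper: verify the inventory file is readable and report its row count.
check_inventory() {
  local file="$1"
  if [ ! -r "$file" ]; then
    echo "ERROR: inventory file not found or unreadable: $file" >&2
    return 1
  fi
  # Assume the first line is a CSV header; the remaining lines are object rows.
  local rows
  rows=$(( $(wc -l < "$file") - 1 ))
  echo "Inventory rows: $rows"
}

# Only run the check if INVENTORY_FILE has been exported.
if [ -n "${INVENTORY_FILE:-}" ]; then
  check_inventory "$INVENTORY_FILE"
fi
```

An empty or missing file here usually means the inventory step failed; re-run the option you chose before invoking the Gemini analysis.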

Run the Analysis (Gemini)

Use a Gemini CLI custom command to analyze the object inventory of an Amazon S3 bucket. The analysis calculates aggregate metrics, assesses migration complexity, and generates steps and code for setting up Storage Transfer Service to migrate the data to Cloud Storage:

  1. Change the working directory:

    cd "$(git rev-parse --show-toplevel)/projects/gemini-powered-migrations-to-google-cloud"
    
  2. Start the Gemini CLI:

    gemini
    

Execution & Interactive Example

  1. Start Gemini CLI and run the command:

    /aws-s3-migration-analysis analyze file ${INVENTORY_FILE}
    

    The tool will output a Markdown report detailing migration recommendations; a sample of the generated report is included with this project.

Press Ctrl + C twice to exit Gemini CLI.

Cleanup

AWS

To delete the AWS bucket resources, run:

 cd "$(git rev-parse --show-toplevel)/projects/migrate-from-aws-to-google-cloud/s3-to-cloud-storage/scripts/"
 ./cleanup-aws-s3-demo.sh ${BUCKET_NAME}

When prompted, enter DELETE to confirm the deletion of the S3 bucket and its contents.

Google Cloud

To delete the Storage Bucket and Transfer job, run:

 ./cleanup-gcp-demo.sh <PROJECT_ID> <BUCKET_NAME> <JOB_NAME>