Gemini-powered Amazon S3 to Cloud Storage migration recommendations¶
This project provides a set of Gemini CLI custom commands that accelerate the assessment and migration of workloads and data from Amazon Web Services (AWS) S3 to Google Cloud Storage. The custom commands use Gemini to analyze inventory data, identify resources that are suitable for migration, and augment traditional assessments with AI.
Prerequisites¶
AWS Permissions¶
To run the inventory extraction script (`s3_inventory_generator.sh`) on your
source bucket, your AWS user or role requires the following read
permissions:

- `s3:ListBucket` (and `s3:ListBucketVersions` or `s3:ListObjectVersions`)
- `s3:GetBucketLocation`
- `s3:GetBucket*` (e.g., `GetBucketPolicy`, `GetBucketTagging`, `GetBucketEncryption`)
- `s3:GetObject` (required to read object metadata/headers)
- `s3:GetObjectTagging`
AWS Permissions for Demo Script¶
This demo requires an S3 bucket populated with objects before you can run the
custom Gemini commands. The project includes a shell script, `setup-s3-demo.sh`,
that creates an AWS bucket with objects for demo purposes. The following AWS
permissions are necessary to run `setup-s3-demo.sh`:

- AWS Permissions: An AWS user account with write permissions to create
  resources. Least-privilege policy actions:
  - `s3:CreateBucket`
  - `s3:PutBucketTagging`
  - `s3:PutObject`
  - `s3:ListAllMyBuckets`
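As an illustration, a minimal inline policy granting these actions might be attached as follows; the user and policy names are placeholders, and scoping `Resource` more tightly than `*` is left to your environment:

```bash
# Placeholder names: substitute your own IAM user and policy name.
aws iam put-user-policy \
  --user-name YOUR_AWS_USER \
  --policy-name s3-demo-setup \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": [
        "s3:CreateBucket",
        "s3:PutBucketTagging",
        "s3:PutObject",
        "s3:ListAllMyBuckets"
      ],
      "Resource": "*"
    }]
  }'
```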
Google Cloud Permissions¶
To execute the migration commands generated by Gemini (creating buckets and
transfer jobs), ensure your Google Cloud User or Service Account has the
Owner role (roles/owner) on the project.
Requirements¶
To effectively use these instructions, ensure the following are set up:

- Gemini CLI: For installation instructions, see the Gemini CLI Deployment Guide.
- AWS CLI: Install the AWS CLI.
- Verify the AWS CLI version:
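  A quick way to confirm the CLI is installed and on your `PATH`:

  ```bash
  aws --version
  ```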
- Configure AWS Credentials: Provide your AWS Access Key ID, Secret Access Key, and default region:
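  The standard interactive setup stores these values in your AWS config:

  ```bash
  aws configure
  ```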
Project Setup¶
Before running any scripts, you must install the Gemini CLI and clone this repository to access the necessary tools and scripts.
- Clone the Repository:
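  The repository URL is not preserved here, so the URL below is a placeholder; substitute the actual location of this project:

  ```bash
  # Placeholder URL: replace with the repository that hosts this project.
  git clone https://github.com/ORGANIZATION/REPOSITORY.git
  cd REPOSITORY
  ```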
Prepare Inventory Data¶
To run the analysis, you need an inventory file containing a list of objects. You can generate live data from AWS or use a local file; use the table below to determine which option best fits your environment:
| Option | Use Case | Requirements |
|---|---|---|
| Option A | Full Demo Creation: You want to generate a fresh AWS bucket with test data and scan it. | AWS Credentials |
| Option B | Bring Your Own Bucket: You already have an S3 bucket you want to analyze. | AWS Credentials & Existing Bucket |
| Option C | Quick Start / No AWS: You do not have AWS access or want to use the provided mock data. | None |
| Option D | Bring Your Own File: You already have an S3 inventory CSV file generated. | Existing Inventory CSV |
Option A¶
Create & Analyze a New Demo Bucket (Requires AWS Credentials)¶
Use this option to create a new S3 bucket filled with generated data, and then scan it. This is best for a full end-to-end test.
- Navigate to the project directory:
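  The scripts path below is taken from the Cleanup section of these instructions:

  ```bash
  cd "$(git rev-parse --show-toplevel)/projects/migrate-from-aws-to-google-cloud/s3-to-cloud-storage/scripts/"
  ```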
- Set Environment Variables: Customize the following variables as needed:
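  A sketch of the variables; `BUCKET_NAME` is used throughout these instructions, while `AWS_REGION` and both default values are assumptions:

  ```bash
  # Name for the demo bucket (S3 bucket names must be globally unique).
  export BUCKET_NAME="my-s3-demo-bucket"
  # AWS region for the demo resources (assumed variable; adjust as needed).
  export AWS_REGION="us-east-1"
  ```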
- Run the Setup Script:
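  The script name comes from the demo prerequisites above; passing the bucket name as an argument is an assumption about its interface:

  ```bash
  ./setup-s3-demo.sh "${BUCKET_NAME}"
  ```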
  When prompted, enter `yes` to approve the creation of AWS resources.
- Run the Inventory Script: Execute the script against the bucket you just created (using the variable you exported earlier):
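  The script name comes from the AWS Permissions section; as above, the bucket-name argument is an assumption:

  ```bash
  ./s3_inventory_generator.sh "${BUCKET_NAME}"
  ```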
- Verify Output & Export Variable: The script creates a file named `${BUCKET_NAME}_inventory.csv`. Run the following to export this file path for the next step:
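  The output file name is stated above; the variable name `INVENTORY_FILE` is an assumption used consistently in these examples:

  ```bash
  export INVENTORY_FILE="${BUCKET_NAME}_inventory.csv"
  ```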
Option B¶
Generate Live AWS Data (Requires AWS Credentials)¶
Use this option if you already have an S3 bucket you want to use for this test.
- Identify Your Bucket: Export the name of your existing bucket:
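  For example (placeholder bucket name):

  ```bash
  export BUCKET_NAME="your-existing-bucket-name"
  ```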
- Run the Script: Execute the script against the specific AWS bucket you want to analyze (using the variable you exported earlier):
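  As in Option A, the bucket-name argument is an assumption about the script's interface:

  ```bash
  ./s3_inventory_generator.sh "${BUCKET_NAME}"
  ```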
- Verify Output & Export Variable: The script creates a file named `${BUCKET_NAME}_inventory.csv`. Run the following to export this file path for the next step:
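  Using the assumed `INVENTORY_FILE` variable name from Option A:

  ```bash
  export INVENTORY_FILE="${BUCKET_NAME}_inventory.csv"
  ```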
Option C¶
Use Sample Data (No AWS Environment Required)¶
Use this option if you do not have an AWS environment. You will use a pre-generated sample inventory file.
- Set Environment Variables:
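  The location of the bundled sample file is not preserved here, so the path below is a placeholder; point it at the sample inventory shipped with the project:

  ```bash
  # Placeholder path: replace with the sample inventory CSV in this repository.
  export INVENTORY_FILE="./sample-data/sample_inventory.csv"
  ```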
Option D¶
Use Your Own Inventory File¶
Use this option if you have already generated an inventory CSV file from your own S3 environment and want to use it for this analysis. If you need to generate this file, follow the instructions: Build an inventory of your Amazon S3 buckets
- Set Environment Variables:
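  Point the assumed `INVENTORY_FILE` variable at your existing CSV:

  ```bash
  export INVENTORY_FILE="/path/to/your_inventory.csv"
  ```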
Run the Analysis (Gemini)¶
Use a Gemini CLI custom command to analyze the object inventory of an Amazon S3 bucket. The analysis calculates aggregate metrics, assesses migration complexity, and generates steps and code for setting up Storage Transfer Service to migrate the data to Cloud Storage:
- Change the working directory:
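  Assuming the custom commands are defined at the root of the `s3-to-cloud-storage` project directory (the path is inferred from the Cleanup section):

  ```bash
  cd "$(git rev-parse --show-toplevel)/projects/migrate-from-aws-to-google-cloud/s3-to-cloud-storage/"
  ```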
- Run the inventory analysis command within the Gemini CLI, as shown in the interactive example below.
Execution & Interactive Example¶
- Start Gemini CLI and run the command:
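  A sketch of the interactive session; the custom command name `/analyze-inventory` is a placeholder, since the actual command name is not preserved here, and its use of `INVENTORY_FILE` is an assumption:

  ```bash
  # Launch the interactive Gemini CLI from the project directory.
  gemini

  # Then, at the Gemini CLI prompt, invoke the custom command
  # (placeholder name; substitute the command defined by this project):
  #   > /analyze-inventory ${INVENTORY_FILE}
  ```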
The tool outputs a Markdown report detailing the migration recommendations.
Press `Ctrl + C` twice to exit the Gemini CLI.
Cleanup¶
AWS¶
To delete the AWS bucket resources, run:
cd "$(git rev-parse --show-toplevel)/projects/migrate-from-aws-to-google-cloud/s3-to-cloud-storage/scripts/"
./cleanup-aws-s3-demo.sh ${BUCKET_NAME}
When prompted, enter `DELETE` to confirm the deletion of the S3 bucket and its contents.
Google Cloud¶
To delete the Cloud Storage bucket and the Storage Transfer Service job, run:
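A sketch of the cleanup, assuming a destination bucket and transfer job created by the generated migration commands; the job and bucket names below are placeholders:

```bash
# Placeholder names: substitute the transfer job and bucket that were created.
# Mark the Storage Transfer Service job as deleted.
gcloud transfer jobs update JOB_NAME --status=deleted

# Delete the destination bucket and all of its objects.
gcloud storage rm --recursive gs://DESTINATION_BUCKET_NAME
```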