The first challenge was deciding how to migrate the data from RDS to S3. I evaluated several options: AWS Data Pipeline, AWS Glue, the manual export option in RDS, and AWS Database Migration Service. I also compared the cost of each service against the budget and scenario, and finally chose AWS Database Migration Service for the migration.
AWS Database Migration Service helps you migrate databases to AWS quickly and securely. The source database remains fully operational during the migration, minimizing downtime to applications that rely on the database. The AWS Database Migration Service can migrate your data to and from most widely used commercial and open-source databases.
AWS Database Migration Service supports homogeneous migrations such as Oracle to Oracle, as well as heterogeneous migrations between different database platforms, such as Oracle or Microsoft SQL Server to Amazon Aurora. With AWS Database Migration Service, you can continuously replicate your data with high availability and consolidate databases into a petabyte-scale data warehouse by streaming data to Amazon Redshift and Amazon S3. To learn more about the supported source and target databases, read the AWS Database Migration Service documentation.
In this post, you will learn how to migrate data from RDS to S3 using AWS Database Migration Service. I use an S3 bucket, an EC2 server, and an RDS instance, and then create an IAM role and the AWS Database Migration Service resources that move the data from RDS to S3.
You’ll need Amazon Simple Storage Service, Amazon Elastic Compute Cloud, and Amazon Relational Database Service for this post. Getting started with Amazon Simple Storage Service explains how to create a bucket, Getting started with Amazon Elastic Compute Cloud explains how to launch an EC2 server, and Getting started with Amazon Relational Database Service explains how to create a relational database. For this post, I assume you already have a bucket, an EC2 server, and an RDS instance.
The blog post consists of the following phases:
- Create IAM role and replication instance in AWS Database Migration Service
- Create the source and target endpoint for migration of data in database migration service
- Create a database migration task for migrating the existing data from source to target
- Test the migration of data from RDS to S3 using the database migration task
- Open the IAM console and create a role named mysqltos3 with the AmazonRDSFullAccess and AmazonS3FullAccess policies attached.
- Open the AWS Database Migration Service console and click on create replication instance.
- Name the replication instance dmsreplicationinstance and choose the instance class dms.t3.medium (or as your workload requires). Leave the engine version, storage, VPC, public accessibility, subnet group, Availability Zone, security group, and KMS key at their default settings (or adjust as needed), then create the replication instance.
- Once the replication instance reaches the Available state, you can review all of its configured details.
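If you prefer the command line, the IAM role and replication instance steps above can be sketched with the AWS CLI. This is a minimal sketch, not the exact console flow: the role and instance names match the post, but the storage size and other values are assumptions to adjust for your environment.

```shell
# Trust policy so DMS can assume the role
cat > dms-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "dms.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Create the role and attach the two managed policies from the post
aws iam create-role \
  --role-name mysqltos3 \
  --assume-role-policy-document file://dms-trust.json
aws iam attach-role-policy --role-name mysqltos3 \
  --policy-arn arn:aws:iam::aws:policy/AmazonRDSFullAccess
aws iam attach-role-policy --role-name mysqltos3 \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

# Create the replication instance with mostly default settings
aws dms create-replication-instance \
  --replication-instance-identifier dmsreplicationinstance \
  --replication-instance-class dms.t3.medium \
  --allocated-storage 50

# Block until the instance reaches the "available" state
aws dms wait replication-instance-available \
  --filters Name=replication-instance-id,Values=dmsreplicationinstance
```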
- In AWS Database Migration Service, create the source endpoint. Select the RDS DB instance checkbox and choose your RDS instance. For endpoint database access, choose Provide access information manually, fill in the other required details, and keep the default KMS key. Run the connection test with the default settings; once it succeeds, create the source endpoint.
- In AWS Database Migration Service, create the target endpoint. Set the endpoint identifier (for example, s3-target) and choose Amazon S3 as the target engine. Enter the service access role ARN and the bucket name created earlier, fill in the other required details, and keep the default KMS key. Run the connection test with the default settings; once it succeeds, create the target endpoint.
- Once both are created successfully, the source and target endpoints should show an Active status.
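The two endpoints can also be created from the AWS CLI. A hedged sketch, assuming a MySQL source: the hostname, credentials, account ID, role ARN, and bucket name below are placeholders, not values from the post.

```shell
# Source endpoint: the MySQL RDS instance (hostname/credentials are placeholders)
aws dms create-endpoint \
  --endpoint-identifier rds-source \
  --endpoint-type source \
  --engine-name mysql \
  --server-name mydb.xxxxxxxx.us-east-1.rds.amazonaws.com \
  --port 3306 \
  --username admin \
  --password '<master-password>'

# Target endpoint: the S3 bucket, written via the service access role
aws dms create-endpoint \
  --endpoint-identifier s3-target \
  --endpoint-type target \
  --engine-name s3 \
  --s3-settings ServiceAccessRoleArn=arn:aws:iam::123456789012:role/mysqltos3,BucketName=my-dms-bucket

# Test each endpoint against the replication instance (repeat per endpoint ARN)
aws dms test-connection \
  --replication-instance-arn <replication-instance-arn> \
  --endpoint-arn <endpoint-arn>
```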
- In AWS Database Migration Service, create a database migration task named dmsreplicationtaskofexisting. Select the replication instance, source endpoint, and target endpoint, and set the migration type to Migrate existing data. Set the target table preparation mode to Drop tables on target and leave the other options at their defaults.
- In the selection rule for table mapping, choose Enter a schema, set the schema name to %, the table name to %, and the action to Include. The task can be set to start automatically on creation or manually later; I chose Manually later, and then created the data migration task.
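The selection rule above corresponds to a DMS table-mapping JSON document. A sketch of the mapping and the task creation via the AWS CLI follows; the endpoint and instance ARNs are placeholders, and `full-load` is the CLI spelling of the console's Migrate existing data option.

```shell
# Selection rule: include every table in every schema (schema %, table %)
cat > table-mappings.json <<'EOF'
{
  "rules": [{
    "rule-type": "selection",
    "rule-id": "1",
    "rule-name": "include-everything",
    "object-locator": { "schema-name": "%", "table-name": "%" },
    "rule-action": "include"
  }]
}
EOF

# Create the task; it will not start until triggered manually
aws dms create-replication-task \
  --replication-task-identifier dmsreplicationtaskofexisting \
  --source-endpoint-arn <source-endpoint-arn> \
  --target-endpoint-arn <target-endpoint-arn> \
  --replication-instance-arn <replication-instance-arn> \
  --migration-type full-load \
  --table-mappings file://table-mappings.json
```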
- Log in to the EC2 server and connect to the MySQL database using the master username and password. Then import the dump database into RDS from the command line.
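For reference, the import step might look like this from the EC2 server; the RDS endpoint, username, and dump file name are assumptions.

```shell
# Import the dump into the RDS instance (prompts for the master password)
mysql -h mydb.xxxxxxxx.us-east-1.rds.amazonaws.com -u admin -p < dump.sql

# List the databases that will be migrated
mysql -h mydb.xxxxxxxx.us-east-1.rds.amazonaws.com -u admin -p \
  -e "SHOW DATABASES;"
```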
- Start the data migration task by modifying the task just created. Once replication starts, it copies all data to the S3 bucket. You can follow the status of the running task, including the number of tables loaded into S3, on the Table statistics tab of the task console.
- Check the S3 bucket: the output is organized into one folder per database, with tables stored as CSV files. Compare the database names from the server terminal with the database folders in the bucket, and review the task metrics in CloudWatch.
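Starting the task and verifying the results can likewise be done from the CLI; the task ARN and bucket name are placeholders.

```shell
# Start the migration task
aws dms start-replication-task \
  --replication-task-arn <replication-task-arn> \
  --start-replication-task-type start-replication

# Per-table load progress (the console's Table statistics view)
aws dms describe-table-statistics \
  --replication-task-arn <replication-task-arn>

# The bucket should contain one folder per database, with tables as CSV files
aws s3 ls s3://my-dms-bucket/ --recursive
```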
Finally, delete the environment: the S3 bucket, EC2 server, RDS instance, IAM role, and AWS Database Migration Service resources.
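A hedged cleanup sketch with the AWS CLI; the ARNs, bucket name, DB identifier, and instance ID below are placeholders, with resource names matching this post where possible.

```shell
# Tear down DMS resources first: task, endpoints, then the instance
aws dms delete-replication-task --replication-task-arn <replication-task-arn>
aws dms delete-endpoint --endpoint-arn <source-endpoint-arn>
aws dms delete-endpoint --endpoint-arn <target-endpoint-arn>
aws dms delete-replication-instance \
  --replication-instance-arn <replication-instance-arn>

# Remove the bucket (with its contents), the RDS instance, and the EC2 server
aws s3 rb s3://my-dms-bucket --force
aws rds delete-db-instance --db-instance-identifier mydb --skip-final-snapshot
aws ec2 terminate-instances --instance-ids i-0123456789abcdef0

# Detach the managed policies before the role can be deleted
aws iam detach-role-policy --role-name mysqltos3 \
  --policy-arn arn:aws:iam::aws:policy/AmazonRDSFullAccess
aws iam detach-role-policy --role-name mysqltos3 \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam delete-role --role-name mysqltos3
```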
Reviewing the pricing and estimated cost of this example -
For Simple Storage Service →
Cost = $0.01
For Relational Database Service →
Cost = $1.21
For Elastic Compute Cloud →
Cost = $0.20
For Database Migration Service →
Cost = $0.32
For cloudwatch, data transfer and kms (under free tier) →
Cost = $0.00
Total Cost = $(0.01+1.21+0.20+0.32+0.00) = $1.74
In this post, I showed you how to migrate data from RDS to S3 using AWS Database Migration Service.
For more details, check out Get started with AWS Database Migration Service in the AWS Database Migration Service console, or read the AWS Database Migration Service documentation.
Thanks for reading!
Connect with me: Linkedin