How to Back Up DynamoDB Data to a Different AWS Account

2021/2/13

This blog shows how to migrate DynamoDB data between AWS accounts. You can migrate DynamoDB using a few different approaches: 1. AWS S3 with AWS Glue, 2. Data Pipeline, 3. Amazon EMR. Here, we'll use S3 and focus on how to move DynamoDB backup data into a different AWS account.

The following are the requirements for migrating data from one AWS account to another:

  • An AWS account to use as the source account.
  • In the source account, the DynamoDB table to migrate and the S3 bucket the table will be exported to.
  • An AWS account to use as the target account, with the bucket the source account's backup files will be copied into.

Export DynamoDB Table to S3

Here, we'll export the DynamoDB table into the S3 bucket. Log in to AWS and go to the DynamoDB section.

Enable point-in-time recovery

Before backing up the DynamoDB table to S3, you have to enable PITR (point-in-time recovery). After enabling PITR, click Export to S3, then choose the source table you want to back up to the same account's S3.
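
If you prefer the CLI, PITR can also be enabled with a command like the one below (the table name here is a placeholder):

aws dynamodb update-continuous-backups \
    --table-name MySourceTable \
    --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true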

Enter the destination S3 bucket name and click the Export button. The export should now start and will copy all the backup files for the DynamoDB table into the S3 bucket. Note that you can back up to a different AWS account's bucket directly, but that is not covered in this blog.
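
The export itself can also be triggered from the CLI; the region, account ID, table name, and bucket name below are placeholders:

aws dynamodb export-table-to-point-in-time \
    --table-arn arn:aws:dynamodb:us-west-2:111111111111:table/MySourceTable \
    --s3-bucket my-dynamodb-backup \
    --export-format DYNAMODB_JSON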

Add Bucket Policy to Source Bucket

After the DynamoDB table has been backed up to the S3 bucket, you'll need to modify the bucket policy on the source bucket. Replace ACCOUNT-B-ID with account B's ID, replace ACCOUNT-B-USER with the account B user that will later perform the S3 file sync, and replace ACCOUNT-A-Bucket-NAME with the bucket that holds the DynamoDB backup content.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::{ACCOUNT-B-ID}:user/{ACCOUNT-B-USER}"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::{ACCOUNT-A-Bucket-NAME}/*",
                "arn:aws:s3:::{ACCOUNT-A-Bucket-NAME}" 
           ]
        }
    ]
}
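
One way to apply this policy (assuming it is saved locally as bucket-policy.json) is the s3api CLI, run with account A's credentials:

aws s3api put-bucket-policy \
    --bucket {ACCOUNT-A-Bucket-NAME} \
    --policy file://bucket-policy.json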

Destination Bucket Setup

Set Up a Custom Policy

For the destination bucket setup, we need to set up a user that has permission to perform the bucket sync. The following policy grants that user access to both buckets.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::{source bucket name}/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::{destination bucket name}/*"
            ]
        }
    ]
}

Create a User and Attach the Policy
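
As a rough sketch, the user, policy, and attachment can all be created from the CLI in account B; the user and policy names below are placeholders, and the policy above is assumed to be saved as sync-policy.json. Note that the user name must match the {ACCOUNT-B-USER} referenced in the source bucket policy.

aws iam create-user --user-name dynamodb-sync-user
aws iam create-policy \
    --policy-name dynamodb-sync-policy \
    --policy-document file://sync-policy.json
aws iam attach-user-policy \
    --user-name dynamodb-sync-user \
    --policy-arn arn:aws:iam::{ACCOUNT-B-ID}:policy/dynamodb-sync-policy
aws iam create-access-key --user-name dynamodb-sync-user

The last command returns an access key pair for the new user; keep it for the next step.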

AWS Configure

Here, you'll run aws configure and enter the credentials of the user who has the custom policy.
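
The prompts look something like the following; enter the access key pair created for the sync user and the region of the destination bucket:

aws configure
AWS Access Key ID [None]: {sync user's access key id}
AWS Secret Access Key [None]: {sync user's secret access key}
Default region name [None]: us-west-1
Default output format [None]: json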

Execute the Command

At this point, both buckets should be ready and we can run the command to copy files from account A's bucket to account B's bucket.

aws s3 sync s3://{source bucket name} s3://{target bucket name} --source-region {source region} --region {target region}
# For example:
aws s3 sync s3://my-dynamodb-backup s3://my-dynamodb --source-region us-west-2 --region us-west-1
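
Once the sync finishes, you can list the target bucket to verify the backup files arrived, for example:

aws s3 ls s3://my-dynamodb --recursive --region us-west-1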

Migrate Data from S3 to DynamoDB

Now the target S3 bucket should contain the DynamoDB backup files, and there are a few ways to handle the ETL that migrates the data into DynamoDB. You can use a Glue job to read the files from the S3 bucket and write them to the target DynamoDB table, or write a Lambda function that reads the S3 files and writes them into the DynamoDB table. The following are a few possible ways to migrate data from S3 to DynamoDB, but they are not covered in this blog:

  • Write a Glue job
  • Data Pipeline
  • Write a Lambda function
  • Use Amazon EMR