Migrate PostgreSQL AWS Database Instances Quickly

Learn how to migrate PostgreSQL AWS database instances like an expert.
We needed to migrate a Postgres AWS database from a Production EC2 instance to a Staging EC2 instance yesterday, so we decided to write it up for posterity.
Navigate out to the production instance via ssh. I will typically write a shell script that executes the following command. You will need to make sure that the pem file has the proper permissions.

ssh -i amazon-production.pem deploy@production-host
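
If ssh rejects the key with an "unprotected private key file" warning, tightening the pem file's permissions is the usual fix:

chmod 400 amazon-production.pem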

Once you're out on the production box, we're going to make use of the pg_dump command. There are a number of ways to execute pg_dump; I've had the best success with the following command, where the Postgres database is running under the postgres user. The argument after pg_dump is the name of the database. We're going to pipe the entire contents of the db out to a file in the /home/deploy directory.

sudo -u postgres pg_dump database_name > /home/deploy/database_name.sql
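
If the database is large, one option is to pipe the dump through gzip so there's less to move over the wire. This is an optional variant, not a required step:

sudo -u postgres pg_dump database_name | gzip > /home/deploy/database_name.sql.gz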

If you open the database_name.sql file in vi, you'll see the raw DDL.
Next, we need to get database_name.sql off of the production instance. For this example, we're just going to scp the file down to my box. The following scp command is executed from my laptop and will bring database_name.sql down to my machine.

scp -i amazon-production.pem deploy@production-host:/home/deploy/database_name.sql database_name.sql
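
If you want to confirm the file came down intact, you can compare checksums on both ends (md5sum on Linux, md5 on macOS) and make sure they match:

md5sum database_name.sql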

After we have database_name.sql on my local drive, the next step is to upload the file to the staging server. We again use an scp command to place database_name.sql up on staging.

scp -i amazon-staging.pem database_name.sql ubuntu@staging-host:/home/ubuntu/database_name.sql

Next, we ssh into the staging server:

ssh -i amazon-staging.pem ubuntu@staging-host

Once we're on the staging server, we need to switch to the postgres user and create a new database. We can do that with the following commands:

sudo su postgres

psql

create database database_name template template0;
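
If your application connects as its own role rather than postgres, you can set the owner at creation time. The app_user role below is a hypothetical stand-in for whatever role your app actually uses:

create database database_name owner app_user template template0;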

Next, we'll exit psql with \q and import database_name.sql into the freshly created database_name:

psql database_name < /home/ubuntu/database_name.sql
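
If you'd rather have the import fail fast and roll back cleanly on any error, psql can wrap the whole load in a single transaction:

psql --single-transaction -v ON_ERROR_STOP=1 database_name < /home/ubuntu/database_name.sql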

When you navigate into the new database with psql, you should be able to connect to it and see the tables:

\c database_name

\dt
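
As a quick sanity check, you can count the tables in the public schema and compare the number against the same query run on production:

select count(*) from pg_catalog.pg_tables where schemaname = 'public';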

In certain situations, you may need to rename a database or two. In those cases, I'd use the following SQL command from psql. Note that you can't rename the database you're currently connected to, so run this while connected to a different database, such as postgres:

alter database database_name rename to new_database_name;

In certain situations, the database may have open connections, which will prevent you from renaming it. In those cases, you may want to kill the existing sessions with the following command:

SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE pid <> pg_backend_pid() AND datname = 'database_name';
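
If you want to see who's holding those connections before killing them, this query lists the open sessions against the database:

SELECT pid, usename, application_name, state FROM pg_stat_activity WHERE datname = 'database_name';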

Now you can migrate Postgres AWS databases like a champ.


Joel Garcia

Joel Garcia has been building AllCode since 2015. He’s an innovative, hands-on executive with a proven record of designing, developing, and operating Software-as-a-Service (SaaS), mobile, and desktop solutions. Joel has expertise in HealthTech, VoIP, and cloud-based solutions. Joel has experience scaling multiple start-ups for successful exits to IMS Health and Golden Gate Capital, as well as working at mature, industry-leading software companies. He’s held executive engineering positions in San Francisco at TidalWave, LittleCast, Self Health Network, LiveVox acquired by Golden Gate Capital, and Med-Vantage acquired by IMS Health.
