A Customer Journey

Modernizing IBM DB2 to AWS Aurora PostgreSQL

The scope of work in this project was to migrate all of our customer's IBM DB2 databases to AWS Aurora PostgreSQL. Our client wanted to free themselves of costly yearly license renewals, accelerate their move to AWS, and increase their footprint in fully managed services.

Database Assessment

Our initial step was to assess all the objects in the database to determine the difficulty of migrating to a managed service like AWS Aurora PostgreSQL. Most objects can be converted to PostgreSQL easily; others, such as packages, procedures, and functions, are more complicated and often require rewriting code. Specific database features also need to be checked for a PostgreSQL equivalent that is supported by AWS Aurora. In some cases, application code needs to be modified to support the new database platform. A full assessment is necessary to identify possible delays and any work that may be required of the application team.
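
As a sketch of this inventory step, the snippet below (assuming the ibm_db driver, with placeholder connection details and a placeholder MYSCHEMA schema) pulls object counts from the DB2 system catalog. The real assessment goes much deeper, covering data types, engine features, and SQL PL constructs.

```python
import ibm_db

# Placeholder connection string: substitute the real host, database, and credentials.
conn = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=db2host;PORT=50000;UID=migrator;PWD=secret;", "", "")

# Count the object types that drive conversion effort (MYSCHEMA is a placeholder).
SQL = """
SELECT 'TABLES',     COUNT(*) FROM SYSCAT.TABLES   WHERE TABSCHEMA = 'MYSCHEMA' AND TYPE = 'T'
UNION ALL
SELECT 'VIEWS',      COUNT(*) FROM SYSCAT.VIEWS    WHERE VIEWSCHEMA = 'MYSCHEMA'
UNION ALL
SELECT 'PROCEDURES', COUNT(*) FROM SYSCAT.ROUTINES WHERE ROUTINESCHEMA = 'MYSCHEMA' AND ROUTINETYPE = 'P'
UNION ALL
SELECT 'FUNCTIONS',  COUNT(*) FROM SYSCAT.ROUTINES WHERE ROUTINESCHEMA = 'MYSCHEMA' AND ROUTINETYPE = 'F'
UNION ALL
SELECT 'TRIGGERS',   COUNT(*) FROM SYSCAT.TRIGGERS WHERE TRIGSCHEMA = 'MYSCHEMA'
"""

stmt = ibm_db.exec_immediate(conn, SQL)
row = ibm_db.fetch_tuple(stmt)
while row:
    print(f"{row[0]:<12} {row[1]}")
    row = ibm_db.fetch_tuple(stmt)
```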

Database Object Creation

Once all potential issues and possible delays have been identified, database objects are converted and created in the target system. Packages, procedures, and functions are fully tested. Users are created and grants are copied over. All parameters and options are checked and copied to parameter and option groups within the AWS RDS console. Python scripts are used to confirm that all objects were created in the target RDS database.
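
A minimal sketch of that confirmation step, assuming psycopg2 and placeholder connection and schema details; our actual scripts compare complete object name lists against the source, not just counts.

```python
import psycopg2

SCHEMA = "myschema"  # placeholder target schema

with psycopg2.connect("host=aurora-endpoint dbname=target user=migrator") as conn:
    with conn.cursor() as cur:
        cur.execute(
            "SELECT COUNT(*) FROM information_schema.tables "
            "WHERE table_schema = %s AND table_type = 'BASE TABLE'", (SCHEMA,))
        tables = cur.fetchone()[0]
        # information_schema.routines covers both functions and procedures.
        cur.execute(
            "SELECT COUNT(*) FROM information_schema.routines "
            "WHERE routine_schema = %s", (SCHEMA,))
        routines = cur.fetchone()[0]

# Compare these counts (and the full object name lists) against the DB2 assessment.
print(f"tables={tables} routines={routines}")
```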

Manual Fixes to Database Objects

When all baseline objects are created, manual work begins on stored code: procedures, packages, and functions. From our extensive experience with this task, we have created our own list of best practices for ensuring a successful migration of the code. Full testing of these functions is done with data that we supply. To keep demands on our customers to a minimum, we take the extra time to examine table data and fully understand the code so we can create proper test cases.
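
As an illustration of how such a test harness can look, the sketch below installs a converted PL/pgSQL function and checks it against expected results sampled from the source system. The order_total function, the order_items table, and the test data are all hypothetical, and psycopg2 is assumed.

```python
import psycopg2

# Hypothetical example: a DB2 SQL PL function rewritten as PL/pgSQL.
CONVERTED_FN = """
CREATE OR REPLACE FUNCTION order_total(p_order_id integer)
RETURNS numeric LANGUAGE plpgsql AS $$
DECLARE v_total numeric;
BEGIN
    SELECT COALESCE(SUM(quantity * unit_price), 0)
      INTO v_total FROM order_items WHERE order_id = p_order_id;
    RETURN v_total;
END $$;
"""

# (input, expected) pairs built from data we sampled and verified in DB2.
TEST_CASES = [(1001, 250.00), (1002, 0)]

with psycopg2.connect("dbname=target") as conn, conn.cursor() as cur:
    cur.execute(CONVERTED_FN)
    for order_id, expected in TEST_CASES:
        cur.execute("SELECT order_total(%s)", (order_id,))
        actual = cur.fetchone()[0]
        assert actual == expected, f"order {order_id}: got {actual}, want {expected}"
```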

AWS Resource Creation

AWS resources are then created through Terraform or CDK. Resources include the AWS Aurora PostgreSQL cluster, the AWS DMS replication instance, DMS endpoints and tasks, and security groups. Development and production solutions are designed at this time. When the infrastructure is written in CDK, the production setup is coded into the application; the production setup enables Multi-AZ high availability on the resources.
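
A condensed sketch of the CDK approach, assuming a recent aws-cdk-lib v2 with its writer/readers cluster API; the stack and construct names are illustrative. The prod flag toggles the reader instance that provides the Multi-AZ layout, and the DMS replication instance, endpoints, tasks, and security groups would be defined alongside it.

```python
from aws_cdk import Stack
from aws_cdk import aws_ec2 as ec2
from aws_cdk import aws_rds as rds
from constructs import Construct


class AuroraMigrationStack(Stack):
    """Hypothetical stack; prod toggles the high-availability layout."""

    def __init__(self, scope: Construct, construct_id: str, *,
                 prod: bool = False, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        vpc = ec2.Vpc(self, "MigrationVpc", max_azs=2)

        rds.DatabaseCluster(
            self, "AuroraPostgres",
            engine=rds.DatabaseClusterEngine.aurora_postgres(
                version=rds.AuroraPostgresEngineVersion.VER_15_3),
            writer=rds.ClusterInstance.provisioned("writer"),
            # PROD adds a reader in a second AZ for Multi-AZ high availability.
            readers=[rds.ClusterInstance.provisioned("reader")] if prod else [],
            vpc=vpc,
        )
```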

AWS Database Migration Service (DMS) is the tool used to load and convert the data from IBM DB2 to the target Aurora PostgreSQL database.

Set Up AWS DMS and Load Data

After the database objects are created and verified, Python scripts are used to disable all constraints and triggers in the target database. AWS DMS is then set up to perform a full load followed by CDC replication to the target database. The DMS tasks are configured to validate and verify all of the migrated data.
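
A simplified sketch of that pre-load script, assuming psycopg2 and a placeholder schema: foreign keys are saved to a re-create script and dropped, and user triggers are disabled, so the DMS full load can populate tables in any order. Everything is restored once CDC replication has caught up.

```python
import psycopg2
from psycopg2 import sql

SCHEMA = "myschema"  # placeholder target schema

with psycopg2.connect("host=aurora-endpoint dbname=target user=migrator") as conn:
    with conn.cursor() as cur:
        # Save every foreign-key definition so it can be recreated after the load.
        cur.execute("""
            SELECT conrelid::regclass::text, conname, pg_get_constraintdef(oid)
              FROM pg_constraint
             WHERE contype = 'f' AND connamespace = %s::regnamespace
        """, (SCHEMA,))
        with open("recreate_fks.sql", "w") as f:
            for table, name, definition in cur.fetchall():
                f.write(f"ALTER TABLE {table} ADD CONSTRAINT {name} {definition};\n")
                # Table names here come straight from the system catalog.
                cur.execute(sql.SQL("ALTER TABLE {} DROP CONSTRAINT {}").format(
                    sql.SQL(table), sql.Identifier(name)))

        # Disable user-defined triggers for the duration of the load.
        cur.execute("SELECT tablename FROM pg_tables WHERE schemaname = %s", (SCHEMA,))
        for (table,) in cur.fetchall():
            cur.execute(sql.SQL("ALTER TABLE {}.{} DISABLE TRIGGER USER").format(
                sql.Identifier(SCHEMA), sql.Identifier(table)))
```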

Application Conversion Effort

After the data is loaded, we assist the application team in converting the SQL statements found in application code to PostgreSQL. This is usually a manual effort and can take time, as it means working closely with internal application teams. Full application testing is done at this stage.
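
A few of the DB2 idioms that typically come up in this conversion, shown as a sketch of the kind of scanner used to flag statements for the application team. The list here is illustrative, not exhaustive.

```python
# Illustrative DB2 idioms paired with the PostgreSQL forms usually substituted.
REWRITES = [
    ("CURRENT DATE",          "CURRENT_DATE"),
    ("CURRENT TIMESTAMP",     "CURRENT_TIMESTAMP"),
    ("VALUE(",                "COALESCE("),   # DB2 VALUE is a synonym for COALESCE
    ("NEXTVAL FOR",           "nextval('...')"),
    ("FROM SYSIBM.SYSDUMMY1", ""),            # PostgreSQL allows SELECT with no FROM
]


def flag_db2_idioms(sql_text: str) -> list:
    """Return the DB2 idioms found in one statement so it can be reviewed."""
    upper = sql_text.upper()
    return [db2 for db2, _pg in REWRITES if db2 in upper]


print(flag_db2_idioms("SELECT CURRENT DATE FROM SYSIBM.SYSDUMMY1"))
# ['CURRENT DATE', 'FROM SYSIBM.SYSDUMMY1']
```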

Cut-Over, Rollback Plans and Hypercare

At this point we are ready to cut over our first system, which is usually DEV. Cut-over plans and rollback options are discussed and agreed on. The same cut-over plan is executed again as we move QA and PROD. Production is the third pass through the cut-over process, and it generally goes seamlessly.

Services

Cloud Consulting Services
Advisory Services
Project Management

Technologies

AWS EC2
AWS Lambda
IBM DB2
Terraform
Ansible
Liquibase
Direct Connect
AWS VPN
AWS DMS
AWS SCT