Data Migration Testing Strategy




1.  Purpose of the Article

The purpose of this article is to describe the different test stages that will be performed, how the testing will be prepared and executed for each stage, and the criteria that must be met before the transition from one stage to the next can occur.

2.  Phases involved in Data Migration Testing

Below are the phases involved in testing data migration.
·        Fusion
·        Release
1.      Design
2.      Implementation
a)      EAR
b)      Iterations
3.      SIT (System Integration Testing)
4.      UAT (User Acceptance Testing)

2.1.        Fusion (Discover and Define)
In this phase the requirements for the data migration project are gathered and identified from the testing perspective. QA is responsible for defining the scope of testing, the high-level estimation, and the Requirement Specification documents.

In this phase we perform pre-migration testing. These tests occur early in the migration process, before any migration (even a migration run for testing purposes) is completed. The pre-migration testing options include:
·         Verify scope of source systems and data with user community and IT. Verification should include data to be included as well as excluded and, if applicable, tied to the specific queries being used for the migration.
·         Verify the source to target high-level mappings for each category of data or content and verify that the desired type has been defined in the destination system.
·         Verify destination system data requirements such as the field names, field type, mandatory fields, valid value lists and other field-level validation checks.
·         Using the source to destination mappings, test the source data against the requirements of the destination system. For example, if the destination system has a mandatory field, ensure that the appropriate source is not null, or if the destination system field has a list of valid values, test to ensure that the appropriate source fields contain these valid values.
·         Test source and target system connections from the migration platform.
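The field-level checks described above (mandatory fields, valid value lists) can be sketched in Python. The field names and rules below are hypothetical stand-ins for what a real mapping document would define:

```python
# Sketch of a pre-migration check: validate source records against the
# destination system's field rules before any migration run.
# The field names ("customer_name", "status") and rules are hypothetical.

destination_rules = {
    "customer_name": {"mandatory": True},
    "status": {"mandatory": True,
               "valid_values": {"ACTIVE", "INACTIVE", "CLOSED"}},
}

def validate_record(record):
    """Return a list of rule violations for one source record."""
    errors = []
    for field, rules in destination_rules.items():
        value = record.get(field)
        if rules.get("mandatory") and value in (None, ""):
            errors.append(f"{field}: mandatory field is empty")
        elif "valid_values" in rules and value not in rules["valid_values"]:
            errors.append(f"{field}: '{value}' not in valid value list")
    return errors

source_records = [
    {"customer_name": "Acme", "status": "ACTIVE"},
    {"customer_name": "", "status": "PENDING"},
]

for i, rec in enumerate(source_records):
    for err in validate_record(rec):
        print(f"record {i}: {err}")
```

Running every source record through checks like these before the migration surfaces data-quality problems while they are still cheap to fix.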

Key inputs:
·         High Level Data Flow
·         High Level Use Cases/Specification
Key Outputs:
·         Requirement Specification Docs
·         Draft Version of Test Plan
·         Mapping Documents
·         High Level Estimation Plan

2.2.        Release
2.2.1.            Design
Conduct a formal design review of the migration specification when the pre-migration testing is nearly complete.
The specification should include:
·         A definition of the source systems
·         The source system’s data sets and queries
·         The mappings between the source system fields and the destination system
·         Number of source records
·         Number of source systems records created per unit time (to be used to define the migration timing and downtime)
·         Identification of supplementary sources
·         Data cleansing requirements
·         Performance requirements
·         Testing requirements

In this phase, the mapping document is defined, which contains detailed information about the mapping from the legacy system to the migrated system. QA is responsible for closing out the test strategy and plan and for starting to write high-level scenarios/cases based on the requirement specification and mapping documents. The aim of the QA team is to understand the data model and the field mappings with the help of these documents and to create the test scenarios. The team is responsible for identifying the test data management process and defining the RTM (Requirements Traceability Matrix). The existing database needs to be reviewed in this phase.

Key inputs:
·         Detailed Business requirement/Specification Document
·         Mapping Document
Key Outputs:
·         Draft test plan & strategy
·         Signed-off Test Strategy
·         Signed-off Test Plan
·         High Level Test Scenarios
·         RTM (Scenarios mapped with requirements)

2.2.2.            Implementation
2.2.2.1.        EAR (Executable architectural release)
In this phase the QA team creates the detailed test cases and SQL scripts for validating the migration. The QA team is responsible for test data set-up and for ensuring that the test cases are mapped to all the test scenarios, which in turn are mapped to scope items in the requirements document. The team is responsible for checking the readiness of the test environment, e.g. that the ETL process has run with updated/refreshed data in the DB.

Key inputs:
·         Signed-off Test Scenarios
·         Mapping Document (contains business logic or transformation logic)
Key Outputs:
·         Signed-off Test Cases & SQL Queries for Data Validation
·         Test Data
·         Traceability Matrix (Requirements mapped with test cases)
·         Test Environment readiness report
2.2.2.2.        Iterations
In this phase the QA team performs the actual testing on the data being migrated, in iterations. During this phase the tester performs partial system testing as per the scope of that particular iteration.

Key inputs:
·         Signed-off Test Cases
·         Test Environment and tools are running fine
·         Test Data
Key Outputs:
·         Test Results
·         Defect report
·         All System test cases executed successfully
·         No Sev1 and Agreed number of Open Sev2 Defects

2.2.3.            System Integration Testing (SIT)
Once a migration has been executed, end-to-end testing can be executed. Expect a significant number of errors to be identified during the initial test runs, although this will be minimized if sufficient pre-migration testing has been executed. Post-migration testing is typically performed in a test environment and includes:

·         Test the throughput of the migration process (number of records per unit time). This testing will be used to verify that the planned downtime is sufficient. For planning purposes, consider the time to verify that the migration process was completed successfully.
·         Compare Migrated Records to Records Generated by the Destination System – Ensure that migrated records are complete and of the appropriate context.
·         Summary Verification – There are several techniques that provide summary information, including record counts and checksums. Here, the number of records migrated is compiled from the destination system and then compared to the number of records in the source system.
·         Compare Migrated Records to Sources – Tests should verify that field values are migrated as per the migration specification. In short, source values and the field-level mappings are used to calculate the expected results at the destination. This testing can be completed using sampling if appropriate; if the migration includes data that poses significant business or compliance risk, 100% of the migrated data can be verified using an automated testing tool.
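The summary-verification technique above (record counts plus a checksum over a numeric column) can be sketched with sqlite3 standing in for the real source and destination systems. The table and column names ("accounts", "balance") are hypothetical:

```python
# Sketch of post-migration summary verification using sqlite3 as a stand-in
# for the real source and destination databases.
import sqlite3

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
source.executemany("INSERT INTO accounts VALUES (?, ?)", rows)
target.executemany("INSERT INTO accounts VALUES (?, ?)", rows)  # simulated migration

def summary(db):
    # Record count plus a simple checksum (sum of a numeric column).
    count, total = db.execute(
        "SELECT COUNT(*), SUM(balance) FROM accounts").fetchone()
    return count, total

assert summary(source) == summary(target), "record counts or checksums differ"
print("counts and checksums match:", summary(source))
```

In a real migration the two connections would point at the legacy and destination databases, and the summary query would be run per table.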
Key inputs:
·         Signed-off Test Cases
·         Test Environment and tools are running fine
·         Test Data
Key Outputs:
·         Test Results
·         Defect report
·         No Sev1 and Agreed number of Open Sev2 Defects.
·         UAT Preparedness

2.2.4. User Acceptance Testing (UAT)

In this phase the UAT team is involved in testing and making sure of the below:
·         Validate that the migration plan satisfies the business and technical requirements identified for the migration.
·         Find and resolve potential risks to the production environment
·         Attention should be given to reporting, downstream feeds, and other system processes that rely on migrated data
Key inputs:
·         QA Testing is complete and meets exit criteria
·         No Sev1 defects in OPEN state
·         UAT Test Cases have been written and signed off by the Business SME
·         UAT Test Data has been set up
·         All the necessary access has been granted to the business users
·         Release notes/Traceability Matrix has been made available
Key Outputs:
·         All in-scope tests for UAT have been executed at least once
·         UAT Test Results have been recorded in QC
·         No Sev1 and Agreed number of Open Sev2 Defects.
·         An action plan has been created for any agreed outstanding defects

3.  Architecture

Data migration testing compares migrated data with the original data to find any discrepancies.
When we speak of database migration testing, we can also refer to it as ETL testing.

Below is the ETL architecture:


4.  Testing Approach

Data migration activity typically includes everything required, with respect to data, to ensure that the new database is up and running without any defects, that it contains all the legacy data (data present in the existing database), and that the data has been migrated to the correct table(s) and column(s).

Many constraints shape the style and amount of testing that we perform on a database. Testing can contain one or all of the following:
·         Simple data validation
·         Full life cycle of testing addressing individual unit tests
·         System and acceptance testing
There are several types of tests that we use during the migration process. During the testing stage, we go through several cycles of testing to enhance the quality of the database. The test cases we use should make sure that any issues encountered in a previous version of the DB database are not introduced again.

Figure 1 below indicates what you should consider testing when it comes to relational databases.
The diagram is drawn from the point of view of a single database; the dashed lines indicate threat boundaries, indicating that you need to consider threats both within the database and at its interfaces.

Below picture depicts stages involved in Data Migration Testing (agile approach):



4.1.         To test the DB database:
1.    Create a controlled version of the migrated database. Keep the database migration scripts in a source control system.
2.   Design a set of test cases that we can use to test the DB database from unit to system level. The test cases should:

a.   Ensure the following:

·         All the users in the source database have migrated successfully
·         Privileges and grants for users are correct
·         Tables have the correct structure, defaults are functioning correctly, and errors did not occur during mapping

b.   Validate that the data migrated successfully by doing the following:

·         Comparing the number of rows in the DB database with those in the source database
·         Calculating the sum of numerical columns in the DB database and comparing with those in the source database

c.   Ensure the following:
·         Constraints prevent us from entering duplicate primary keys
·         Foreign keys prevent us from entering inconsistent data
·         Check constraints prevent us from entering invalid data

d.   Check that indexes and sequences are created successfully

e.   Ensure that triggers, procedures, and functions are migrated successfully. Check that the correct values are returned for triggers and functions

3.   Run the test cases against the migrated database.
4.   Create a report that evaluates the test case results.
These reports allow you to evaluate the data to qualify the errors, file problem reports, and provide the customer with a controlled version of the database.
5.   If the tests pass, go to Step 7
If all tests in the test cases pass or contain acceptable errors, the test passes. If acceptable errors occur, document them in an error report that we can use for audit purposes.
6.   If the test cases fail, do the following:

a.   Identify the cause of the error.

b.   Identify the test cases needed to check the errors.

c.   Log an issue on the controlled version of the migrated database code in the problem report.

d.   Add the test case and a description of the problem to the incident tracking system of your organization, which could be a spreadsheet or bug reporting system. Aside from the test case, the incident log should include the following:

·         Provide a clear, concise description of the incident encountered
·         Provide a complete description of the environment, such as platform and source control version
·         Attach the output of the test; if useful, indicate the frequency and predictability of the incident
·         Provide a sequence of events leading to the incident
·         Describe the effect on the current test, diagnostic steps taken, and results noted

e.   Attempt to fix the errors.

f.    Return to Step 1
7.   Identify acceptance tests that we can use to make sure the DB database is at an acceptable quality level.
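The constraint checks in step 2c (duplicate primary keys, foreign keys, check constraints) can be sketched with sqlite3 as a stand-in destination database. The schema below is hypothetical:

```python
# Sketch of step 2c: verify that constraints survived the migration, using
# sqlite3 as a stand-in destination database. Table names are hypothetical.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")  # sqlite enforces FKs per-connection
db.execute("CREATE TABLE dept (id INTEGER PRIMARY KEY)")
db.execute("""CREATE TABLE emp (
    id INTEGER PRIMARY KEY,
    dept_id INTEGER REFERENCES dept(id),
    age INTEGER CHECK (age >= 18))""")
db.execute("INSERT INTO dept VALUES (1)")
db.execute("INSERT INTO emp VALUES (1, 1, 30)")

def rejects(sql):
    """True if the statement is rejected with an integrity error."""
    try:
        db.execute(sql)
        return False
    except sqlite3.IntegrityError:
        return True

assert rejects("INSERT INTO emp VALUES (1, 1, 40)")   # duplicate primary key
assert rejects("INSERT INTO emp VALUES (2, 99, 40)")  # unknown foreign key
assert rejects("INSERT INTO emp VALUES (3, 1, 10)")   # check constraint violated
print("constraint checks passed")
```

Against a real destination database, the same pattern applies: deliberately attempt invalid inserts and assert that each is rejected.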

QA will focus on the following testing types for Data Migration.
1.      Functional Testing
a)      Data Validation
b)      Regression testing
2.      Non-Functional Testing
a)      Performance testing 
b)      Load testing
3.      Integration Testing
4.      UAT Testing

4.2.1.            Functional Testing
All the functionality testing activities carried out in the Implementation phase of development would be categorized as Functional testing. Testing team will create functional test cases and ensure that these test cases are mapped to all the test scenarios which in turn are mapped to scope items in the requirements document. Testing activities in a particular iteration would depend upon the development stories being delivered.

4.2.1.1.        Data Validation Testing Approach and its Activities
Refer to section 4.1 to test the DB database

4.2.1.2.        Regression Testing Approach and its Activities
At the end of each iteration, the application would be regression tested to ensure that a change in one part of the application does not hamper the functionality of other areas.
·         The regression suite will consist of all high-priority scenarios of the requirements delivered in a given iteration.
·         Re-validation of existing functionality after fixes/changes.

4.2.2.            Non-Functional Testing

Testing team will create Non- functional test cases and ensure that these test cases are mapped to all the test scenarios which in turn are mapped to scope items in the requirements document.
Non-functional technical tests are tests that verify correct operation of the migration mechanisms, such as transformation, loading, purging, and incremental data migration. The roll-back and commit modes are also verified.
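Roll-back and commit verification can be sketched with sqlite3 transactions standing in for the real loading mechanism. The table name is hypothetical:

```python
# Sketch verifying roll-back and commit behaviour of a load step, using
# sqlite3 transactions as a stand-in for the real migration mechanism.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staged (id INTEGER PRIMARY KEY)")
db.commit()

# Simulate a failed incremental load: insert, then roll back on error.
try:
    db.execute("INSERT INTO staged VALUES (1)")
    raise RuntimeError("simulated load failure")  # load step fails mid-way
except RuntimeError:
    db.rollback()

count = db.execute("SELECT COUNT(*) FROM staged").fetchone()[0]
assert count == 0, "rolled-back rows must not persist"

# A successful load commits and its rows persist.
db.execute("INSERT INTO staged VALUES (1)")
db.commit()
assert db.execute("SELECT COUNT(*) FROM staged").fetchone()[0] == 1
print("roll-back and commit behave as expected")
```

The same pattern (fail the load deliberately, then confirm nothing partial was committed) is what the non-functional tests above would assert against the real migration platform.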

4.2.3.            Integration Testing

The objective of integration testing is to verify that the functional areas of the system are working when integrated.

4.2.4.            User Acceptance Testing (UAT)

User acceptance testing provides an opportunity for the user community to interact with legacy data in the destination system prior to production release, and most often, this is the first such opportunity for the users. Attention should be given to reporting, downstream feeds, and other system processes that rely on migrated data.

