Test Scenarios to Consider During Each Phase of the ETL Process, Through Reporting

During Extraction Process


  1. Extraction scripts extract all the attributes/columns required as per the extraction design
  2. Extraction scripts have access to all the source systems from which data needs to be extracted
  3. Extracted data format (flat file/complex file/table) is as per the interface specification design
  4. Extraction scripts extract complete data (full dump/delta) as per the Changed Data Capture (CDC) design (see the sketch after this list)
  5. Appropriate audit/error logs for extraction are generated
  6. The extraction process finishes within the expected time frame
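
Scenarios 1, 3, and 4 lend themselves to simple automated checks. Below is a minimal Python sketch that validates an extracted flat file against its interface specification; the file name, delimiter, column list, and row count are hypothetical placeholders.

```python
import csv

def validate_extract(path, expected_columns, expected_row_count=None):
    """Check an extracted flat file against the interface specification."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter="|")  # delimiter per the interface spec
        header = next(reader)
        # Scenarios 1 and 3: every attribute required by the design is present
        missing = set(expected_columns) - set(header)
        if missing:
            raise AssertionError(f"Missing columns in extract: {missing}")
        row_count = sum(1 for _ in reader)
    # Scenario 4: completeness -- row count should match the source-side count
    if expected_row_count is not None and row_count != expected_row_count:
        raise AssertionError(
            f"Row count mismatch: extract has {row_count}, "
            f"source reported {expected_row_count}"
        )
    return row_count

# Hypothetical file, columns, and count, purely for illustration
validate_extract(
    "customer_extract.dat",
    expected_columns=["customer_id", "name", "city", "updated_at"],
    expected_row_count=120_450,
)
```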

During Transformation Process


  1. All the data required for loading has been extracted and is in the correct format
  2. Attribute-level source-to-target mapping is as per the mapping design
  3. Business rules have been implemented correctly as per the mapping design
  4. There is no data leakage across the various transformation phases in the workflow (see the sketch after this list)
  5. Data rejected for any reason (PK-FK integrity, data type, etc.) is recorded for audit purposes
  6. Appropriate audit/error/performance/run logs are generated
  7. Transformation processes finish within the stipulated time frame
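
Scenario 4 reduces to a simple reconciliation rule: every record entering a transformation stage must either reach the next stage or land in the reject/audit table. A minimal sketch, assuming the stage counts are available from run logs or staging tables (the counts shown are illustrative):

```python
def check_no_leakage(source_count, target_count, rejected_count):
    """Records in must equal records out plus records rejected; anything
    else means rows silently disappeared somewhere in the workflow."""
    unaccounted = source_count - target_count - rejected_count
    if unaccounted != 0:
        raise AssertionError(
            f"Data leakage: {source_count} in, {target_count} out, "
            f"{rejected_count} rejected; {unaccounted} unaccounted for"
        )

# Counts would normally come from staging tables or run logs; these are illustrative
check_no_leakage(source_count=1_000_000, target_count=998_750, rejected_count=1_250)
```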

During Loading Process


  1. The Physical Data Model of the DW has all the entities/attributes needed to deliver the reports/KPIs
  2. All the records/attributes coming out of the transformation stage are loaded into the target
  3. Data rejected for any reason (PK-FK integrity, data type, etc.) is recorded for audit purposes
  4. Appropriate logs for the loading activity are generated
  5. Record counts match for target tables that keep the same granularity as the source data (usually dimension/master tables; see the sketch after this list)
  6. Summarized values of measures match the source data in case of aggregation (usually fact tables)
  7. Data loaded into the DW is transformed and loaded correctly as defined in the mapping design
  8. Loading processes finish within the stipulated time frame
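
Scenarios 5 and 6 can be automated as paired scalar queries against the source and the warehouse. A minimal sketch, using sqlite3 purely as a stand-in for the real database drivers; the table and column names are hypothetical:

```python
import sqlite3  # stand-in: source and DW would normally use their own drivers

def assert_scalar_match(src_conn, tgt_conn, src_sql, tgt_sql, label):
    """Run one scalar query on each side and compare the results."""
    src_val = src_conn.execute(src_sql).fetchone()[0]
    tgt_val = tgt_conn.execute(tgt_sql).fetchone()[0]
    if src_val != tgt_val:
        raise AssertionError(f"{label}: source={src_val}, target={tgt_val}")

src = sqlite3.connect("source.db")  # hypothetical databases
dw = sqlite3.connect("dw.db")

# Scenario 5: dimension tables keep the source granularity, so row counts must match
assert_scalar_match(src, dw,
                    "SELECT COUNT(*) FROM customer",
                    "SELECT COUNT(*) FROM dim_customer",
                    "customer row count")

# Scenario 6: fact tables are aggregated, so compare the summarized measure instead
assert_scalar_match(src, dw,
                    "SELECT SUM(amount) FROM sales",
                    "SELECT SUM(total_amount) FROM fact_sales",
                    "total sales amount")
```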

During Data Mart Loading Process


  1. The Physical Data Model of the DM has all the entities/attributes needed to deliver the reports/KPIs
  2. Data rejected for any reason (PK-FK integrity, data type, etc.) is recorded for audit purposes
  3. Appropriate logs for the loading activity are generated
  4. Record counts match for target tables that keep the same granularity as the source data (usually dimension/master tables)
  5. Summarized values of measures match the source data in case of aggregation (usually fact tables; see the sketch after this list)
  6. Loading processes finish within the stipulated time frame
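
When the DM rolls facts up from the DW (for example, daily to monthly), scenario 5 means summarizing both sides to the mart's grain and comparing. A minimal sketch; the monthly grain, table names, and sqlite3 stand-in are all assumptions:

```python
import sqlite3  # stand-in for the actual DW and DM drivers

dw = sqlite3.connect("dw.db")        # hypothetical databases
dm = sqlite3.connect("datamart.db")

# Roll the daily DW fact up to the DM's assumed monthly grain, then compare
dw_rows = dw.execute(
    "SELECT strftime('%Y-%m', sale_date) AS month, SUM(total_amount) "
    "FROM fact_sales GROUP BY month ORDER BY month"
).fetchall()
dm_rows = dm.execute(
    "SELECT month, SUM(amount) FROM fact_sales_monthly "
    "GROUP BY month ORDER BY month"
).fetchall()

assert dw_rows == dm_rows, "DW-to-DM monthly totals diverge"
```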

During Reporting Process


  1. Business reports and dashboards display correct data (see the sketch after this list)
  2. The report structure is complete in terms of format, attributes displayed, report-level filters/derived calculations, and prompts
  3. Ad-hoc report users have all the attributes they need available in the semantic layer
  4. Users can access only those reports (and only the slices of data within a report) that they are authorized to access
  5. Report bursting has been implemented correctly for scheduled reports
  6. Reports (canned, ad-hoc, scheduled) refresh within an acceptable time frame
  7. Security profiles for the different user types (basic, power) have been set up correctly
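
One way to automate scenario 1 is to export the report and reconcile its total against a direct query on the data mart. A minimal sketch; the export file, measure column, and table names are hypothetical:

```python
import csv
import sqlite3

def report_total(export_path, measure_column):
    """Sum a measure column from the report's CSV export."""
    with open(export_path, newline="") as f:
        return round(sum(float(row[measure_column]) for row in csv.DictReader(f)), 2)

# Scenario 1: the figure shown on the report must match what the data mart holds
dm = sqlite3.connect("datamart.db")  # hypothetical database and table
expected = dm.execute(
    "SELECT ROUND(SUM(amount), 2) FROM fact_sales_monthly"
).fetchone()[0]
actual = report_total("monthly_sales_report.csv", "amount")
assert actual == expected, f"report shows {actual}, data mart holds {expected}"
```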

Downstream Feed


  1. Downstream systems have access to (all or a subset of) the DW/DM objects needed for data extraction
  2. The DW/DM has all the attributes at the required grain for the downstream systems
  3. Downstream systems always receive a complete feed from the DW/DM; no feed should be sent if the load into the DW/DM completed only partially (see the sketch after this list)
  4. Data sent to downstream systems is complete and in the format defined in the interface specification design
  5. Appropriate audit/error logs are generated
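
Scenario 3 is typically enforced by gating feed generation on a load-audit table. A minimal sketch, assuming a hypothetical load_audit table keyed by batch_id with a per-step status column:

```python
import sqlite3

def feed_allowed(dw_conn, batch_id):
    """Allow the downstream feed only when every step of the load batch
    finished successfully; any failed or partial step blocks the feed."""
    (failed,) = dw_conn.execute(
        "SELECT COUNT(*) FROM load_audit WHERE batch_id = ? AND status <> 'SUCCESS'",
        (batch_id,),
    ).fetchone()
    return failed == 0

dw = sqlite3.connect("dw.db")  # hypothetical database and audit table
if feed_allowed(dw, batch_id=20240131):
    print("Load complete; generating downstream feed")
else:
    print("Load was partial; feed withheld")
```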

