Key Responsibilities:
- Analyze business and technical requirements for data movement and transformation processes.
- Create and execute test plans, test cases, and test scripts for ETL pipelines.
- Perform source-to-target mapping (S2T) validation and data reconciliation.
- Validate ETL transformation logic, data loads, aggregations, and data quality rules.
- Conduct black-box, white-box, and integration testing on data flows.
- Write complex SQL queries to validate data accuracy across databases (Oracle, SQL Server, PostgreSQL, etc.); an illustrative reconciliation query appears after this list.
- Work with ETL developers to understand mappings and raise defects using tools like JIRA, ALM, or Bugzilla.
- Participate in regression testing during code changes or production deployments.
- Automate database test scripts and contribute to continuous integration testing pipelines where applicable.
- Document test results and create reports for QA sign-off and audits.
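As an illustration of the reconciliation and data-quality checks described above, here is a minimal SQL sketch. The table names (stg_customers, dw_dim_customer), their columns, and the UPPER/TRIM transformation rule are hypothetical placeholders; real checks would be driven by the project's S2T mapping document.

    -- 1. Row-count reconciliation: source and target counts should match after the load.
    SELECT 'stg_customers' AS table_name, COUNT(*) AS row_count FROM stg_customers
    UNION ALL
    SELECT 'dw_dim_customer', COUNT(*) FROM dw_dim_customer;

    -- 2. Field-level comparison: apply the mapped transformation to the source side and
    --    return rows that are missing or loaded incorrectly in the target
    --    (use MINUS instead of EXCEPT on Oracle).
    SELECT customer_id, UPPER(TRIM(customer_name)) AS customer_name, country_code
    FROM   stg_customers
    EXCEPT
    SELECT customer_id, customer_name, country_code
    FROM   dw_dim_customer;

    -- 3. Simple data-quality rule: the business key must be unique in the target dimension.
    SELECT customer_id, COUNT(*) AS dup_count
    FROM   dw_dim_customer
    GROUP  BY customer_id
    HAVING COUNT(*) > 1;

In practice, queries like these are parameterized per mapping and bundled into the test scripts and regression suites referenced above.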
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3–6+ years of experience in ETL testing, data warehouse testing, or database testing.
- Proficient in SQL for data validation and test script development.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS, DataStage).
- Experience with databases like Oracle, SQL Server, MySQL, or PostgreSQL.
- Familiarity with data warehousing concepts (star/snowflake schemas, OLAP, OLTP).
- Experience with defect tracking and test management tools.
Preferred Qualifications:
- Experience testing in cloud data platforms (AWS Redshift, Snowflake, Azure Synapse, GCP BigQuery).
- Knowledge of BI/reporting tools (Tableau, Power BI, Qlik) for end-to-end data validation.
- Basic scripting or automation skills in Python, Shell, or PowerShell.
- Exposure to Agile/Scrum environments and CI/CD tools (Jenkins, Git).
Skills Required:
AWS Redshift, Snowflake, Azure Synapse, Oracle, SQL Server, MySQL, Informatica, Talend