Key Responsibilities
• Collaborate with Data Engineering, DevOps, BI teams, and Business SMEs to support DataOps activities.
• Assist in defect triage, root cause analysis, and data issue resolution.
• Support data quality monitoring, validation, and improvement processes.
• Assist in resolving production incidents (L2/L3) related to data defects and pipeline failures.
• Perform regression testing during releases and validate hotfixes.
• Conduct QA for data pipelines, transformations, schema changes, and reporting enhancements.
• Validate end-to-end data flow from source systems to BI reporting.
• Apply data quality rules covering dimensions such as accuracy, completeness, timeliness, and consistency.
• Validate ETL/ELT pipelines and business rules.
• Perform data reconciliation, variance analysis, and SQL-based validation.
• Support pipeline automation using tools such as dbt, Dataiku, and Airflow.
• Contribute to automated testing frameworks and CI/CD pipeline testing.
• Maintain test documentation, runbooks, and operational dashboards.
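To illustrate the kind of reconciliation and SQL-based validation work described above, here is a minimal sketch of a row-count reconciliation check between a source extract and its loaded target. The table and column names are hypothetical, and an in-memory SQLite database stands in for the actual warehouse; in practice the same pattern runs against Snowflake, Synapse, or BigQuery connections, often wrapped in pytest.

```python
import sqlite3

def reconcile_row_counts(conn, source_table, target_table):
    """Return the row-count variance (target minus source).
    A nonzero result indicates dropped or duplicated rows in the load."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return tgt - src

# Hypothetical staging data: a source extract and its loaded target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

variance = reconcile_row_counts(conn, "src_orders", "tgt_orders")
print(variance)  # -1: one row was dropped during the load
```

In a CI/CD pipeline, a check like this would typically be a pytest assertion (`assert variance == 0`) that fails the build when a load is incomplete.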
Key Requirements
• Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field.
• 3–5+ years of experience in data-focused QA or testing roles.
• Experience testing ETL processes, data warehouses, data lakes, and data transformations.
• Strong SQL knowledge (joins, aggregations, window functions, subqueries).
• Familiarity with automation tools such as Selenium, QuerySurge, Datagaps, AccelQ, or Pytest.
• Exposure to BI tools such as Power BI or Tableau.
• Basic knowledge of Snowflake, Azure Synapse, or BigQuery.
• Experience working in Agile/Scrum environments.
• Good analytical, problem-solving, and communication skills.
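As a concrete example of the SQL skills listed above (aggregations and window functions applied to data validation), the sketch below flags duplicate business keys, a routine uniqueness check in ETL testing. The schema is hypothetical, and SQLite (version 3.25+, which supports window functions) stands in for a real warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme',   100.0),
        (1, 'acme',   100.0),  -- duplicate business key
        (2, 'globex', 250.0);
""")

# Count rows per order_id with a window function, then keep
# only the keys that appear more than once.
dupes = conn.execute("""
    SELECT DISTINCT order_id FROM (
        SELECT order_id,
               COUNT(*) OVER (PARTITION BY order_id) AS n
        FROM orders
    ) WHERE n > 1
""").fetchall()
print(dupes)
```

The same check can be written with `GROUP BY ... HAVING COUNT(*) > 1`; the window-function form is handy when the duplicated rows themselves, not just the keys, need to be inspected.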
Preferred Qualifications
• QA certifications such as ISTQB, CSTE, or CSQE.
• Familiarity with AI-based testing tools and compliance standards.