
Data Quality Engineer

Date Posted: Jun 12, 2025

Job #1681774
Contract
Edina, Minnesota, United States

We have partnered with our client in their search for a Data Quality Engineer. 

The individual in this role will have the opportunity to oversee the development and execution of data validation strategies across the organization to ensure data integrity, accuracy, and alignment with company standards. This individual will guide team members in applying best practices for test design, in planning and estimating high-impact data workflow modifications, and in identifying opportunities to automate data quality checks. 


Responsibilities

  • Collects, analyzes, and validates data before it is stored for organizational use, ensuring accuracy across changes to views, stored procedures, batch processes, and diverse data sources.
  • Collaborates with product and data teams to test and deploy data workflow modifications based on collaboratively defined data transformation and integration requirements.
  • Participates in work refinement and estimation processes with data and product teams.
  • Prepares and executes testing on all new and modified code, ensuring compliance with organizational data quality standards.
  • Works with data, product, and change control teams to ensure quality data deliverables that provide customer value.
  • Creates and maintains documentation related to data flows, integration processes, support processes, and system changes, as defined by team and company standards.
  • Ensures timely delivery of data testing tasks and projects and communicates delays or roadblocks with team leadership as needed.
  • Collaborates with product and data teams to design, develop, test, and deploy high-impact data workflow modifications, including ETL / ELT processes that align with business data requirements.
  • Reviews test creation, planning, and execution results for all stages of testing, with a focus on validation where necessary.
  • Provides coaching, mentorship, and guidance for testing automation within the DevOps lifecycle, ensuring team deliverables are met.
  • Develops and implements data testing methodologies, standards, and best practices, including those related to data processes.
  • Applies standard statistical analysis and / or financial models to verify data acceptability and accuracy.
  • Identifies opportunities for process improvement, specifically in data collection processes, and collaborates with the team to correct deficiencies in systems.
  • Identifies and resolves data anomalies, ensuring that missing data is gathered and entered to maintain data accuracy.
  • Provides insights for defect-prone areas in the architecture based on metrics, root cause analysis, and organizational impacts.
  • Leads the resolution process for defects in data workflows and processes, including designing and implementing metrics, dashboards, and reporting capabilities.
  • Troubleshoots, resolves, and communicates production issues related to data processing and data workflows.
  • Guides the automation of data validation processes and finds opportunities to extend automation beyond current workflows.
  • Mentors and shares knowledge with junior team members to enhance skills in testing, automation, and data validation.
  • Identifies and implements improvements in process efficiencies, with a focus on data quality and user experience.
  • Guides the evolution of team-level best practices for data processes and quality assurance while considering their broader organizational impact.
  • Leads efforts to identify, review, refine, and resolve complex efficiency / effectiveness related issues, mobilizing the team to find solutions and improve processes with an emphasis on quality and positive user experience.
  • Utilizes specialized knowledge, experience, and creativity to develop multiple potential solutions, drawing on input from team members.
  • Guides the team in evaluating and communicating the pros and cons of all potential solutions, considering impacts on data quality and business processes.
  • Implements quality assurance practices to ensure data integrity throughout data workflow processes and proactively resolves any anomalies.
  • Understands and effectively communicates the relationships and dependencies between application components.
  • Collaboratively guides evolution of team-level development best practices, standards, and policies, taking into consideration team, product, architecture, and business priorities.
  • Consistently acts according to our customer experience standards, including responding quickly, maintaining a positive attitude, building rapport, demonstrating empathy, managing the customer’s expectations, using the proper communication channel for the situation, and taking ownership to ensure the customer’s issue is resolved.
  • Collaborates with direct teammates, product team leadership, managers, vendors, and business partners to ensure quality data integration processes.
  • Flexes to non-core responsibilities, including analysis, testing, deployment, and facilitation to ensure timely work delivery to stakeholders.

Skills Required

  • Strong mathematical, analytical, and problem-solving skills.
  • Ability to work independently and manage tasks to completion.
  • High attention to detail and quality in all data validation testing processes.
  • Understanding of data workflow test metrics and the ability to provide data validation metrics.
  • Strong communication skills and capability to convey information to technical and non-technical audiences.
  • Advanced knowledge of databases, data workflow tools, and data flows.
  • Experience in SQL query writing and debugging.
  • Experience in gathering and entering missing data to maintain data integrity.
  • Demonstrated advanced organizational and prioritization skills, managing multiple data validation testing projects of high impact and risk.
  • Ability to collaborate effectively with data, product, and development teams.
  • Advanced knowledge of data validation testing methodologies, tools, and best practices.
  • Clear and effective communication skills across all levels of the organization, including technical and business stakeholders.
  • Proficiency in identifying areas of improvement in data collection processes or systems and recommending solutions.
  • Advanced knowledge of data modeling, testing dimensional tables, and leading the validation of Type 1 and Type 2 (slowly changing dimension) transformations and data integrity checks.
  • Strong proficiency in SQL and database querying, with experience leading data warehouse and data lake testing strategies.
  • Experience automating data validation testing processes and overseeing stored procedure development.
  • Familiarity with data warehousing concepts and data modeling.
  • Ability to apply statistical and financial models to data validation processes.
  • Expertise in designing and leading enterprise-level data streaming architectures using Kafka, ESB, etc.
  • Bachelor’s degree in computer science, data science, or related field; experience in lieu of degree is acceptable.
  • Extensive experience with SQL tools for complex queries, testing, and automation.
  • Strong knowledge of ETL / ELT processes and advanced data quality assurance methodologies.
  • Proven experience with integrations involving APIs, service layers, and microservices.
  • Hands-on experience with testing frameworks for data science and machine learning workflows.
  • Proficiency in version control tools (e.g., Git) for managing test scripts, workflows, and collaboration.
  • Expertise in project management software, such as Jira or Azure DevOps, with a focus on leading Agile teams.
  • Advanced knowledge of data pipelines and workflows, including data warehousing and real-time streaming systems.
  • Property-and-casualty insurance experience is highly preferred.
  • Demonstrated ability to lead and mentor a team by providing technical guidance and driving process improvements.
  • Strong communication skills for collaborating with cross-functional teams and presenting complex findings to technical and non-technical stakeholders.
  • Experience with data governance and compliance frameworks to ensure high-quality, reliable data.
