
Data Engineer - Databricks

Date Posted: Jun 29, 2025

Job #1681971
Contract
Houston, Texas, United States

Position: Senior Data Engineer – Databricks Specialist

Overview
Join one of the world’s leading integrated energy and commodity trading firms in a pivotal role that drives data innovation. We are looking for a seasoned Data Engineer with deep expertise in Databricks and modern data architecture. This role is ideal for someone passionate about transforming legacy systems and delivering scalable, high-impact data solutions.

Key Responsibilities

  • Design and implement scalable, secure, and high-performance data solutions using Databricks and other modern data platforms.
  • Lead the migration of data from traditional relational database systems (RDBMS) to Databricks, ensuring minimal disruption and maximum efficiency.
  • Build and optimize ETL pipelines to support data ingestion, transformation, and delivery across the organization (an illustrative sketch follows this list).
  • Collaborate with stakeholders to gather requirements, analyze business processes, and deliver data solutions that drive measurable value.
  • Monitor and enhance the performance of data systems to ensure reliability, scalability, and cost-effectiveness.
  • Define and enforce best practices for data engineering, including data quality, governance, and security.
  • Partner with cross-functional teams—data scientists, analysts, DevOps, and product managers—to align data solutions with business goals.
  • Provide technical leadership and mentorship to junior engineers.
  • Stay current with emerging technologies and recommend tools to enhance data infrastructure and workflows.
  • Ensure compliance with organizational policies, industry standards, and regulatory requirements.
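For context on the kind of pipeline work described above, the sketch below shows a minimal PySpark ETL pattern of the sort commonly run on Databricks: extract from a relational source over JDBC, apply a light transformation, and land the result as a Delta table. This is illustrative only; the JDBC URL, credentials, and table names are hypothetical placeholders, and the Delta write assumes a Databricks or Delta Lake runtime.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.appName("rdbms-to-delta-sketch").getOrCreate()

# Extract: read a table from a hypothetical legacy RDBMS over JDBC.
trades = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://legacy-db:5432/trading")  # placeholder URL
    .option("dbtable", "public.trades")                         # placeholder source table
    .option("user", "etl_user")                                 # placeholder credentials
    .option("password", "********")
    .load()
)

# Transform: basic cleansing plus a derived column.
cleaned = (
    trades
    .dropDuplicates(["trade_id"])
    .filter(F.col("quantity") > 0)
    .withColumn("notional", F.col("quantity") * F.col("price"))
)

# Load: write a managed Delta table (requires Delta Lake, included in Databricks runtimes).
cleaned.write.format("delta").mode("overwrite").saveAsTable("analytics.trades_clean")
```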

Qualifications

  • Bachelor’s or Master’s degree in Computer Science or a related field, or equivalent experience.
  • Proven experience with Databricks, including ETL development and data migration.
  • Databricks certification(s) strongly preferred.
  • Proficiency in cloud platforms such as AWS, Azure, or GCP.
  • Strong understanding of big data technologies, data warehousing, and data modeling.
  • Expertise in SQL, Python, and other scripting languages.
  • Experience with relational database systems (RDBMS) and with transitioning them to cloud-native architectures.
  • Familiarity with containerization (e.g., Docker) and orchestration tools (e.g., Kubernetes).
  • Solid grasp of data governance, data quality, and security principles.
  • Excellent problem-solving, communication, and collaboration skills.

Apply Now
