Job Title
Data Architect - Data Warehousing, Analytics & AI Enablement
Overview
We are seeking an experienced Data Architect with a strong data warehousing background to help design, build, and evolve a centralized enterprise data platform. This role is heavily focused on data warehouse architecture, modeling, and optimization, with the additional responsibility of enabling advanced analytics and AI-driven use cases.
The ideal candidate is deeply experienced in modern data warehousing concepts and understands how to structure large, complex datasets so they can be reliably consumed by reporting tools, analytics platforms, and emerging AI technologies.
Key Responsibilities
- Architect and maintain a centralized enterprise data warehouse that consolidates data from multiple applications and source systems.
- Design and optimize data warehouse schemas, including dimensional and analytical models, to support scalable reporting and analytics.
- Own and evolve ETL/ELT pipelines that ingest, transform, and load data into the warehouse with a focus on performance, reliability, and data quality.
- Develop and maintain a semantic layer that clearly defines data meaning, relationships, and metrics for analytics and AI consumption.
- Partner with engineering, analytics, and data science teams to ensure the data warehouse supports both current and future use cases.
- Enable AI-driven and natural-language analytics by ensuring warehouse data is structured, well-modeled, and consistently defined.
- Establish best practices for data warehousing, including governance, documentation, and scalability considerations.
- Troubleshoot and improve warehouse performance, data freshness, and downstream data usability.
Required Skills & Experience
- Extensive experience designing and supporting enterprise data warehouses
- Strong background in data modeling (dimensional, analytical, or semantic models)
- Proven experience building and maintaining ETL/ELT pipelines
- Hands-on experience with modern data warehouse or data lake architectures
- Proficiency in Python for data transformation and pipeline development
- Strong understanding of how data warehouses support analytics, reporting, and BI tools
- Experience working with large-scale, multi-source datasets
- Ability to translate business and analytical needs into scalable warehouse designs
Preferred / Nice-to-Have Qualifications
- Experience with enterprise ETL platforms (e.g., Informatica or similar tooling)
- Exposure to AI, machine learning, or LLM-enabled analytics use cases
- Experience designing semantic layers for analytics or AI consumption
- Background in software design or engineering
- Familiarity with Java-based ecosystems (helpful but not required)
- Experience operating data platforms in cloud environments
About Korn Ferry
Korn Ferry unleashes potential in people, teams, and organizations. We work with our clients to design optimal organization structures, roles, and responsibilities. We help them hire the right people and advise them on how to reward and motivate their workforce while developing professionals as they navigate and advance their careers. To learn more, please visit Korn Ferry at www.Kornferry.com.