Our client is in search of a Data Engineer who will work with a team of developers to plan, deliver, and iterate on software that provides data solutions for our analytical needs. The ideal candidate is expected to exercise a high degree of independence, initiative, and professional expertise in creating and maintaining software that serves both our customers and our business.
Duties:
- Develop software used in data processing pipelines written in Python and orchestrated through Apache Airflow
- Modify data infrastructure and architecture in support of strategic initiatives
- Manage integration of data across a wide variety of internal and external RESTful APIs and data sources
- Model, implement and maintain a variety of data sets used in reporting and analytics
- Iterate on and maintain star schema data warehouse models
- Use unit, integration, and acceptance testing principles to drive quality
- Perform professional code reviews within your team
Desired Skills and Experience:
- 5+ years of experience as a developer preferred
- Expert in software development concepts and best practices
- Excellent Python programming skills
- Competent in SQL and data modeling, with experience implementing and using database technologies such as MySQL or Postgres
- Solid understanding of modern data architectures
- Solid understanding of and experience with unit and acceptance testing
- Proficient with version control and the corresponding workflows for managing software changes
- Familiar with data warehousing and ETL/ELT processes
- Familiar with Tableau, Apache Airflow, Amazon Redshift, or similar technologies
- Experience with AWS services such as S3, Glue, Athena, and Kinesis is a plus
- Familiar with industry-standard Python software frameworks
- Experience using automated deployment pipeline processes and related technology