We have partnered with our client in their search for a Big Data Developer.
Responsibilities
You will be part of a team responsible for developing, enhancing, and supporting all technical solutions associated with a greenfield Data Platform build-out
Write and support complex and original software code
Participate in all aspects of product development and rollout, including design, development, debugging, implementation, documentation, and support
Assist in developing specification, programming, and documentation standards; help maintain the functional operation of assigned production systems
Embrace challenges and learn new skills; be a catalyst for change
Gain knowledge of the business side of finance and risk management
Learn to write and maintain system design documents
Communicate status to project managers
Learn to estimate task effort and timeframes
Skills Required
Java and C# or Python development experience
Strong experience working with the Azure platform and tools such as Azure Data Factory, Azure Databricks, and Synapse
Data Flow Processes
SQL Development
ETL
Working experience with data modeling, including relational and dimensional modeling
Working knowledge of a source code control tool such as Git
Experience implementing data management/data catalog tools, data quality (DQ) tools, and REST APIs is a plus
Implementation experience managing and working across multiple environments; experience with release and change management; knowledge of firewalls, network protocols, and file transfer (TIBCO)
Familiarity with Agile development methodologies
Sound-to-advanced knowledge of the business, standards, infrastructure, architecture, and technology from a design, support, and solutions perspective
Readiness and motivation, as an experienced developer and subject matter expert, to address and resolve complex issues and to guide, advise, and support clients, partners, and project teams, often while working on multiple medium-to-large projects