We are seeking an experienced Senior Data Developer to join our data engineering team responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives.
The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.
Key Responsibilities
- Data Solution Design and Development:
- Design and develop scalable and high-performance data pipelines using Azure Data Factory (ADF).
- Implement data transformations and processing using Azure Databricks (an illustrative sketch follows this list).
- Develop and maintain NoSQL data models and queries in Cosmos DB.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Data Integration and Architecture:
- Integrate structured and unstructured data from diverse data sources.
- Collaborate with data architects to design end-to-end data flows and system integrations.
- Implement data security, governance, and compliance standards.
- Performance Tuning and Optimization:
- Monitor and tune data pipelines and processing jobs for performance and cost efficiency.
- Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB (see the query sketch following this list).
- Collaboration and Mentoring:
- Collaborate with cross-functional teams including data testers, architects, and business analysts.
- Conduct code reviews and provide constructive feedback to improve code quality.
- Mentor junior developers, fostering best practices in data engineering and cloud development.
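For candidates gauging the day-to-day work, the following is a minimal, illustrative sketch of the kind of PySpark transformation a Databricks notebook or job in this role might perform. The storage paths, column names, and Delta output location are hypothetical and not drawn from any specific project.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks, `spark` is pre-created; getOrCreate() simply reuses the existing session.
spark = SparkSession.builder.getOrCreate()

# Hypothetical landing and curated zones in Azure Data Lake Storage Gen2.
raw_path = "abfss://raw@<storage-account>.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@<storage-account>.dfs.core.windows.net/orders/"

# Read raw JSON, standardize types, and de-duplicate on the business key.
raw = spark.read.format("json").load(raw_path)

curated = (
    raw
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
)

# Write a partitioned Delta table for downstream consumption.
(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save(curated_path)
)
```

In practice a job like this would typically be parameterized and triggered from an ADF pipeline activity, with monitoring and cost tuning handled as described in the responsibilities above.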
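Similarly, much of the Cosmos DB performance-tuning work referenced above comes down to partition-aware queries. The snippet below is a hedged sketch using the azure-cosmos Python SDK; the account endpoint, database, container, and partition key values are placeholder assumptions.

```python
from azure.cosmos import CosmosClient

# Placeholder connection details; supply real values via configuration or a secrets store.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("sales").get_container_client("orders")

# Scoping the query to a single partition key avoids a costly cross-partition fan-out.
items = container.query_items(
    query="SELECT c.id, c.amount FROM c WHERE c.customerId = @cid",
    parameters=[{"name": "@cid", "value": "customer-123"}],
    partition_key="customer-123",
)

for item in items:
    print(item["id"], item["amount"])
```

Choosing a partition key that matches the dominant query pattern (here, assumed to be customerId) is usually the biggest single lever for both request-unit cost and latency.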
Primary Skills
- Data Engineering: Azure Data Factory (ADF), Azure Databricks.
- Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
- Data Modeling: NoSQL data modeling, data warehousing concepts.
- Performance Optimization: Data pipeline performance tuning and cost optimization.
- Programming Languages: Python, SQL, PySpark.
Secondary Skills
- DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
- Security and Compliance: Implementing data security and governance standards.
- Agile Methodologies: Experience in Agile/Scrum environments.
- Leadership and Mentoring: Strong communication and coaching skills for team collaboration.
Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams.
Educational Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications in Azure and Data Engineering, such as:
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: Azure Solutions Architect Expert
- Databricks Certified Data Engineer Associate or Professional
Skills Required
Azure Data Factory, PySpark, Cosmos DB, Data Integration, Python, Performance Tuning