About the Role:
We are looking for a skilled Data Engineer with deep expertise in Azure cloud technologies to design, build, and maintain robust data pipelines and analytics solutions. You will be responsible for ingesting, transforming, and managing large-scale data assets while ensuring data quality, security, and scalability in cloud environments.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Azure Data Factory (ADF) to ingest batch and streaming data from various sources.
- Build and orchestrate complex ETL/ELT workflows involving REST and SOAP APIs (including OAuth-secured endpoints), file ingestion, and data integration into Azure Data Lake Storage Gen2 and Azure Synapse Analytics.
- Implement robust data ingestion solutions handling semi-structured data formats such as Parquet, JSON, and other complex data types (see the sketch after this list).
- Optimize data pipelines for performance, reliability, and scalability to handle very large volumes of log and transactional data.
- Develop data migration strategies and execute seamless data transfers between on-premises systems and cloud environments.
- Create and manage data models within Azure Synapse Analytics and data warehouses to facilitate secure, high-performance data access.
- Collaborate with data analysts and business stakeholders to understand analytical requirements and translate them into data solutions.
- Ensure data quality, consistency, and integrity across pipelines and datasets.
- Utilize Azure Data Lake, Delta Lake, Blob Storage, and other storage solutions to efficiently store and manage large datasets.
- Implement data governance and security best practices to safeguard sensitive information.
- Build and maintain CI/CD pipelines using Azure DevOps for automated deployment and monitoring of data workflows.
- Automate repetitive tasks and build self-healing data pipelines to minimize manual intervention.
- Stay updated on the latest Azure data technologies, trends, and best practices.
- Proactively identify bottlenecks and technical challenges, proposing and implementing innovative solutions.
- Contribute to building a scalable, robust, and maintainable data engineering practice within the organization.
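For illustration only (not part of the role description): a minimal PySpark sketch of the kind of semi-structured ingestion described above, reading nested JSON from ADLS Gen2, flattening it, and appending to a partitioned Delta table. It assumes a Spark environment with Delta Lake configured and ADLS credentials already in place; the storage paths and field names (events, payload, user_id) are hypothetical.

```python
# Hypothetical sketch: flatten nested JSON events into a Delta table on ADLS Gen2.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode, to_date

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

# Read raw nested JSON landed by an upstream copy activity (hypothetical path).
raw = spark.read.json("abfss://raw@mydatalake.dfs.core.windows.net/events/")

# Explode the nested array-of-structs column into one row per event,
# then project the nested fields into flat, typed columns.
flat = (
    raw.select(explode(col("events")).alias("event"))
       .select(
           col("event.id").alias("event_id"),
           col("event.timestamp").cast("timestamp").alias("event_ts"),
           col("event.payload.user_id").alias("user_id"),
       )
       .withColumn("event_date", to_date(col("event_ts")))
)

# Append to a Delta table, partitioned by date so large log volumes stay scannable.
(
    flat.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("abfss://curated@mydatalake.dfs.core.windows.net/events_delta/")
)
```

Partitioning by a derived date column, rather than a raw timestamp, keeps the partition count manageable when the table holds very large volumes of log data.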
Required Technical Skills:
- Expert proficiency in SQL and Azure Data Factory (ADF) is mandatory.
- Strong hands-on experience building data pipelines and orchestrating ETL/ELT workflows in Azure Data Factory.
- Experience integrating with REST and SOAP APIs (including OAuth-secured endpoints) for data ingestion (see the sketch after this list).
- Proficient in handling batch and streaming data ingestion with complex transformations.
- Skilled in processing and manipulating semi-structured data formats such as Parquet, JSON, and nested complex data types.
- Experience working with Azure Synapse Analytics and data warehousing concepts.
- Proficient in Azure Data Lake Storage Gen2, Delta Lake, and Blob Storage management.
- Strong understanding of data modeling, optimization, and best practices for large-scale data solutions.
- Experience working with large volumes of log and event data to generate meaningful analytical insights.
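For illustration only: a minimal Python sketch of OAuth-secured REST ingestion into ADLS Gen2, as referenced in the list above. It uses the requests library and the azure-storage-file-datalake SDK; the token endpoint, source API URL, and storage paths are hypothetical placeholders, and a managed identity or DefaultAzureCredential would typically replace the raw account key.

```python
# Hypothetical sketch: pull records from an OAuth-secured REST API
# and land the raw payload in ADLS Gen2 for downstream processing.
import json

import requests
from azure.storage.filedatalake import DataLakeServiceClient

# 1. Obtain an OAuth 2.0 bearer token (client credentials grant).
token_resp = requests.post(
    "https://auth.example.com/oauth/token",  # hypothetical token endpoint
    data={
        "grant_type": "client_credentials",
        "client_id": "<client-id>",
        "client_secret": "<client-secret>",
    },
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# 2. Pull a page of records from the source REST API.
api_resp = requests.get(
    "https://api.example.com/v1/orders",  # hypothetical source API
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=60,
)
api_resp.raise_for_status()
records = api_resp.json()

# 3. Land the raw JSON in ADLS Gen2 (hypothetical container and path).
service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",
    credential="<storage-account-key>",  # prefer DefaultAzureCredential in practice
)
file_client = service.get_file_system_client("raw").get_file_client(
    "orders/orders_page_1.json"
)
file_client.upload_data(json.dumps(records), overwrite=True)
```

In practice this flow often lives inside an ADF pipeline (for example, a Web activity plus a Copy activity); the sketch just makes the underlying OAuth and landing steps explicit.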