Key Responsibilities
- Design, build, and maintain scalable ETL data pipelines using Azure Data Factory to handle large volumes of structured and unstructured logistics data.
- Collaborate with stakeholders to develop efficient data models and architecture supporting analytics and reporting needs. Ensure data integrity and consistency across the data lifecycle.
- Implement data validation and quality checks to detect and resolve data anomalies and errors. Work closely with business users to address data issues.
- Integrate data from external systems, vendors, and partners via APIs or other methods; develop and maintain reliable data interfaces.
- Optimize data processing and query performance to ensure timely availability of analytics data. Monitor and fine-tune pipelines for reliability and efficiency.
- Document data engineering processes, models, and technical specifications; promote knowledge sharing and best practices within the team.
Skills Required
Azure Data Factory, Python, SQL, ETL, Snowflake