Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines in Snowflake to support data migration from legacy systems.
- Leverage Python for data transformation, automation, and orchestration of migration workflows (see the illustrative sketch after this list).
- Optimize and refactor complex SQL queries to ensure efficient data processing and reporting in Snowflake.
- Collaborate on data modeling and schema design to align with Snowflake architecture and performance best practices.
- Monitor and troubleshoot data pipeline performance during and after migration phases.
- Work closely with data analysts, scientists, and business stakeholders to ensure accurate and timely data delivery.
- Implement and enforce data governance, security policies, and access controls within Snowflake.
- Collaborate with DevOps teams to integrate data engineering workflows into broader CI/CD frameworks.
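By way of illustration, the following is a minimal sketch of the kind of Python-driven migration step this role involves. It assumes the snowflake-connector-python package; the warehouse, database, stage, and table names are hypothetical placeholders, not part of any actual project.

```python
# Minimal sketch of a migration/load step, assuming the
# snowflake-connector-python package; all object names (warehouse,
# database, stage, tables) are hypothetical placeholders.
import os
import snowflake.connector

def load_legacy_extract(stage_path: str) -> None:
    # Connect using credentials supplied via environment variables.
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="MIGRATION_WH",      # hypothetical virtual warehouse
        database="LEGACY_MIGRATION",   # hypothetical target database
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # COPY INTO bulk-loads staged files into a staging table.
        cur.execute(
            "COPY INTO STAGING.ORDERS "
            f"FROM @LEGACY_STAGE/{stage_path} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Simple ELT-style transform: move staged rows into the core table.
        cur.execute("INSERT INTO CORE.ORDERS SELECT * FROM STAGING.ORDERS")
    finally:
        conn.close()

if __name__ == "__main__":
    load_legacy_extract("orders/2024-01-01/")
```

In practice a step like this would be parameterized per source system and scheduled by an orchestration tool rather than run by hand.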
Required Skills
- 4–5 years of experience in data engineering, with proven expertise in Snowflake and Python.
- Strong command of Snowflake features such as scripting, Time Travel, virtual warehouses, and query optimization (Time Travel is illustrated in the sketch at the end of this posting).
- Hands-on experience with ETL tools, data integration strategies, and migration methodologies.
- Solid understanding of data warehousing principles, normalization techniques, and performance optimization.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and orchestration tools.
- Excellent problem-solving skills and the ability to work independently in a dynamic, fast-paced environment.

Skills Required
Azure, Snowflake, AWS, ETL, Data Integration, Python, Data Warehousing, GCP
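As a small illustration of the Time Travel feature listed above, here is a hedged sketch that queries a table as it existed an hour earlier, reusing a connection like the one opened in the previous sketch; the table name is again a hypothetical placeholder.

```python
# Sketch of Snowflake Time Travel from Python; `conn` is a live
# SnowflakeConnection like the one in the earlier sketch, and the
# CORE.ORDERS table name is a hypothetical placeholder.
from snowflake.connector import SnowflakeConnection

def row_count_one_hour_ago(conn: SnowflakeConnection,
                           table: str = "CORE.ORDERS") -> int:
    # AT(OFFSET => -3600) reads the table as it existed 3600 seconds
    # ago, which is useful for before/after checks during a migration.
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table} AT(OFFSET => -3600)")
    return cur.fetchone()[0]
```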