Role: Snowflake Data Engineer
Mode of Working: Remote
Working Hours: 2:00 PM to 11:00 PM IST
Experience Required: 7 to 13 years
Job Description:
Responsible for designing, developing, and maintaining data solutions on the Snowflake cloud data platform, with a strong emphasis on ETL/ELT processes from SAP S/4HANA to Snowflake, advanced SQL and Python scripting, and modern data warehousing.
Key Responsibilities:
- Design, develop, and maintain data solutions using the Snowflake cloud data platform.
- Build robust ETL/ELT pipelines from SAP S/4HANA to Snowflake.
- Write and optimize complex SQL queries, stored procedures, and user-defined functions (UDFs) for analytics and reporting needs, while adhering to best practices for readability and performance.
- Implement and maintain data models within Snowflake, leveraging advanced data warehousing and ETL concepts.
- Work with Snowflake, SQL, and Python scripting, and integrate with AWS and/or Azure cloud services.
- Utilize SNP Glue for building and maintaining data integration pipelines.
- Collaborate with stakeholders and technical teams to gather requirements, manage projects, and deliver scalable solutions.
- Apply leading practices in data fabrics and cloud-based architectures, and incorporate AI solutions where beneficial.
- Integrate Power BI for reporting and visualization.
Required Skills and Experience:
- Deep expertise in Snowflake, advanced SQL, and data modelling best practices.
- Extensive experience in ETL/ELT processes, especially SAP S/4HANA to Snowflake.
- Proficiency in Python scripting, especially in combination with Snowflake.
- Experience with AWS and/or Azure cloud data services, with knowledge of data fabrics and related cloud solutions.
- Proven experience with SNP Glue integration.
- Knowledge of AI solutions is an added advantage.
- Familiarity with Power BI for data visualization and reporting.
(ref: hirist.tech)