Position: Snowflake / Data Vault 2.0 Developer
Location: Remote (EST hours)
Length: 4+ months
Notes:
The project involves implementing Data Vault 2.0 and building the platform from the ground up.
Work includes pipeline development in Snowflake with Streamlit.
This is not a traditional star schema; the approach is based on Snowflake methodology and Data Vault 2.0 (a brief sketch follows these notes).
A replication approach is required across environments (development, production) and the underlying data structures.
Must be able to integrate and code along the way to ensure AI functionality at the end of the build.
Extra steps and integrations are involved, so the role requires someone experienced in both Snowflake and Data Vault 2.0.
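For reference on the modeling approach described in these notes, below is a minimal sketch of what a Data Vault 2.0 hub and satellite could look like in Snowflake, created through a Snowpark session. Everything here is illustrative: the connection parameters, the HUB_DEVICE / SAT_DEVICE_DETAILS names, and the attribute columns are assumptions rather than project specifics; only the hash key / load date / record source pattern reflects standard Data Vault 2.0 convention.

```python
# Minimal sketch: a Data Vault 2.0 hub and satellite created in Snowflake via Snowpark.
# All names (HUB_DEVICE, SAT_DEVICE_DETAILS, RAW_VAULT, connection parameters)
# are illustrative assumptions, not taken from the project.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "RAW_VAULT",
}
session = Session.builder.configs(connection_parameters).create()

# Hub: one row per business key, identified by a deterministic hash key.
session.sql("""
    CREATE TABLE IF NOT EXISTS HUB_DEVICE (
        DEVICE_HK     VARCHAR(64)   NOT NULL,   -- SHA-256 hash of the business key
        DEVICE_BK     VARCHAR       NOT NULL,   -- business key from the source system
        LOAD_DATE     TIMESTAMP_NTZ NOT NULL,
        RECORD_SOURCE VARCHAR       NOT NULL
    )
""").collect()

# Satellite: descriptive attributes tracked over time against the hub's hash key.
session.sql("""
    CREATE TABLE IF NOT EXISTS SAT_DEVICE_DETAILS (
        DEVICE_HK     VARCHAR(64)   NOT NULL,
        LOAD_DATE     TIMESTAMP_NTZ NOT NULL,
        HASH_DIFF     VARCHAR(64)   NOT NULL,   -- detects attribute changes between loads
        DEVICE_TYPE   VARCHAR,
        FIRMWARE_VER  VARCHAR,
        RECORD_SOURCE VARCHAR       NOT NULL
    )
""").collect()
```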
We are seeking a Snowflake-focused Data Engineer with deep expertise in IoT pipelines, ML integration, and containerized applications on Snowpark Container Services (SPCS). This is not a duplicate of EDM’s function: the role embeds applied engineering inside our department, ensuring we can move faster on ML, containerized applications, and complex pipeline orchestration.
Key Responsibilities
1. Snowflake IoT Data Pipelines (Batch + Streaming), illustrated in the sketch after this list
2. Machine Learning Operations (Snowpark + Cortex AI)
3. Containerized Applications on Snowflake SPCS
4. Collaboration & EDM Alignment
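To make the first responsibility area concrete, here is a minimal Snowpark sketch of a batch load from a hypothetical IoT staging table into the hub defined in the earlier sketch. The RAW.IOT_READINGS table, its DEVICE_ID column, and the IOT_GATEWAY record source are assumptions for illustration only, and the code reuses the `session` created above.

```python
# Minimal batch-load sketch (continues the Snowpark `session` from the earlier sketch).
# RAW.IOT_READINGS, DEVICE_ID, and IOT_GATEWAY are hypothetical names for illustration.
from snowflake.snowpark.functions import col, lit, sha2, current_timestamp

readings = session.table("RAW.IOT_READINGS")  # assumed landing table for device telemetry

# Build candidate hub rows: one per distinct business key, with DV 2.0 metadata columns.
hub_rows = (
    readings
    .select(col("DEVICE_ID"))
    .distinct()
    .select(
        sha2(col("DEVICE_ID"), 256).alias("DEVICE_HK"),   # deterministic hash key
        col("DEVICE_ID").alias("DEVICE_BK"),
        current_timestamp().alias("LOAD_DATE"),
        lit("IOT_GATEWAY").alias("RECORD_SOURCE"),
    )
)

# Insert-only load: append only keys that are not already present in the hub.
existing = session.table("HUB_DEVICE").select("DEVICE_HK")
new_rows = hub_rows.join(existing, "DEVICE_HK", "leftanti")
new_rows.write.mode("append").save_as_table("HUB_DEVICE")
```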