Position : Sr. Data Architect – SNOWFLAKE
Duration : 6 months
Budget : 2 Lakhs per month - Fixed
Location : Remote
Experience : 8–10 Years
Timing : 12 PM to 9 PM (with a few hours' overlap with the EST time zone – mandatory)
Notice Period : Immediate
Project Duration : 6 Months - Extendable
Job Description
- A minimum of 8–10 years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms.
- Working hours : 8 hours per day, with mandatory overlap for EST time zone meetings.
Responsibilities
- Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.
- Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.
- Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments.
- Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.
- Conduct performance tuning and optimization of Snowflake databases and queries.
- Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.
Primary Skills
- Extensive experience in designing and implementing data solutions using Snowflake and DBT.
- Proficiency in data modeling, schema design, and optimization within Snowflake environments.
- Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake.
- Expertise in Python / Java / Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake.
- Familiarity with cloud platforms and data technologies (AWS, Azure, GCP).
- Experience implementing data governance frameworks and DataOps practices.
- Working experience in SAP environments.
- Familiarity with real-time streaming technologies and Change Data Capture (CDC) mechanisms.
- Proven track record of architecting and delivering complex data solutions on cloud platforms / Snowflake.
Secondary Skills (Optional)
- Experience with data visualisation tools (e.g., Tableau, Power BI).
- Knowledge of data security and compliance standards.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to junior and non-technical stakeholders.
- Strong problem-solving and analytical skills; ability to work effectively in a collaborative team environment and lead cross-functional initiatives.
Certifications (Desirable but not Mandatory)
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced Architect, SnowPro Advanced Data Engineer).