Job Role : Sr. Data Architect
Experience : Minimum 10 years
Budget : Max 30 LPA
Notice period : Immediate to 30 Days
Location : Trivandrum / Kochi
Job Description
- A minimum of 10 years of experience in data engineering, encompassing the development and scaling of data warehouse and data lake platforms.
- The candidate should possess a strong background in Snowflake, demonstrating leadership in technical design, architecture, and implementation of complex data solutions.
- Working hours: 8 hours per day, with a few hours of overlap with the EST time zone. This overlap is mandatory, as meetings are held during these hours.
- Working hours will be 12 PM - 9 PM.
Responsibilities include :
- Lead the design and architecture of data solutions leveraging Snowflake, ensuring scalability, performance, and reliability.
- Oversee the implementation of data pipelines, ETL processes, and data governance frameworks within Snowflake environments. Ensure efficient data extraction from SAP BW / ECC systems.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications and data models.
- Develop and maintain data architecture standards, guidelines, and best practices, including data governance principles and DataOps methodologies.
- Provide technical guidance and mentorship to data engineering teams, fostering skill development and knowledge sharing.
- Conduct performance tuning and optimization of Snowflake databases and queries.
- Stay updated on emerging trends and advancements in Snowflake, cloud data technologies, data governance, and DataOps practices.
Mandatory Skills :
- Snowflake experience
- Data Architecture experience
- ETL process experience
- Large data migration solutioning experience
Primary Skills :
- Extensive experience in designing and implementing data solutions using Snowflake and DBT.
- Proficiency in data modeling, schema design, and optimization within Snowflake environments.
- Strong understanding of cloud data warehousing concepts and best practices, particularly with Snowflake.
- Expertise in Python / Java / Scala, SQL, ETL processes, and data integration techniques, with a focus on Snowflake.
- Familiarity with other cloud platforms and data technologies (e.g., AWS, Azure, GCP).
- Demonstrated experience in implementing data governance frameworks and DataOps practices.
- Familiarity with real-time streaming technologies and Change Data Capture (CDC) mechanisms.
- Knowledge of data governance principles and DataOps methodologies.
- Proven track record of architecting and delivering complex data solutions on cloud platforms / Snowflake.
Secondary Skills :
- Understanding of SAP BW / ECC systems, including data extraction, transformation, and loading (ETL / ELT) processes.
- Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Experience in designing, developing, and implementing SAP ABAP programs, reports, interfaces, and enhancements.
- Working experience in SAP environments.
- Knowledge of data security and compliance standards.
- Proficiency in SAP ABAP (Advanced Business Application Programming) development.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to junior and non-technical stakeholders.
- Strong problem-solving and analytical skills; ability to work effectively in a collaborative team environment and lead cross-functional initiatives.