We are seeking an experienced Snowflake Data Engineer with 4 to 5 years of expertise in designing, building, and optimizing data pipelines and data warehouses using Snowflake. The candidate will collaborate with data architects, analysts, and other engineers to deliver scalable and efficient data solutions.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using Snowflake
- Optimize Snowflake data warehouse performance including query tuning and cost optimization
- Develop complex SQL queries, stored procedures, and UDFs within Snowflake
- Collaborate with data architects and business teams to understand data requirements and translate them into technical solutions
- Implement data ingestion, transformation, and integration from various sources into Snowflake
- Monitor data quality and ensure data integrity and security within the Snowflake environment
- Automate data pipeline workflows using orchestration tools like Apache Airflow, Talend, or others
- Develop and maintain documentation for data models, pipelines, and processes
- Stay updated with Snowflake features, best practices, and emerging data engineering trends
- Troubleshoot and resolve production issues related to Snowflake and data workflows
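To illustrate the kind of SQL automation these responsibilities involve, here is a minimal sketch of a Python helper that generates a Snowflake MERGE statement for upserting staged data into a target table. The table and column names (`analytics.orders`, `staging.orders_raw`, `order_id`, etc.) are hypothetical, and a real pipeline would execute the statement via the Snowflake connector rather than print it.

```python
def build_merge_sql(target: str, staging: str,
                    key_cols: list[str], cols: list[str]) -> str:
    """Build a Snowflake MERGE (upsert) statement from staging into target.

    key_cols identify matching rows; all cols are inserted for new rows,
    and non-key cols are updated for matched rows.
    """
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols if c not in key_cols)
    insert_cols = ", ".join(cols)
    insert_vals = ", ".join(f"s.{c}" for c in cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on_clause} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical example: upsert order rows keyed on order_id.
sql = build_merge_sql("analytics.orders", "staging.orders_raw",
                      key_cols=["order_id"],
                      cols=["order_id", "status", "amount"])
print(sql)
```

Generating statements this way keeps the upsert logic in version-controlled code, which pairs naturally with the Git and CI/CD expectations listed below.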
Required Skills:
- 4 to 5 years of hands-on experience with the Snowflake Data Warehouse platform
- Strong proficiency in SQL and experience writing complex queries, stored procedures, and scripts in Snowflake
- Experience with ETL/ELT tools such as Apache Airflow, Talend, Informatica, or similar
- Knowledge of data modeling concepts and best practices in data warehousing
- Experience with data ingestion techniques from various data sources (flat files, databases, APIs)
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud Platform where Snowflake is deployed
- Understanding of data governance, security, and compliance principles
- Experience with scripting languages such as Python or Shell for automation tasks
- Familiarity with version control systems (Git) and CI/CD pipelines
Preferred Qualifications:
- Certification in Snowflake or relevant cloud data platforms
- Experience with big data technologies like Hadoop, Spark, or Kafka
- Knowledge of BI tools such as Tableau, Power BI, or Looker for data visualization
- Familiarity with Agile methodologies and DevOps practices
Soft Skills:
- Strong analytical and problem-solving abilities
- Effective communication skills and ability to collaborate with cross-functional teams
- Detail-oriented and quality-focused
- Self-driven and adaptable to evolving technologies
- Ability to manage priorities and meet deadlines
Skills Required:
SQL, AWS, Azure, Apache Airflow, Talend