Data Engineer – Snowflake
Bengaluru / Coimbatore
Employment Type: Full-Time
About the Role
We are seeking a highly skilled Snowflake Data Engineer with expertise in DBT, Airbyte, and Airflow, combined with strong experience in Kimball dimensional modeling. The ideal candidate will design and implement scalable data pipelines, integrate diverse data sources, and build a robust data warehouse that supports business intelligence and analytics initiatives.
Key Responsibilities
Data Integration & Extraction
o Develop and maintain ETL/ELT pipelines using Airbyte and Airflow.
o Extract data from multiple sources including APIs, direct database connections, and flat files.
o Ensure data quality, consistency, and reliability across all ingestion processes.
Data Modeling & Warehousing
o Design and implement Kimball-style dimensional models in Snowflake.
o Build and optimize fact tables and dimension tables to support analytical workloads.
o Collaborate with business teams to define and maintain the enterprise bus matrix across subject areas.
Transformation & Orchestration
o Use DBT to develop modular, testable, and version-controlled transformations.
o Implement data quality checks and documentation within DBT workflows.
Collaboration & Governance
o Work closely with business stakeholders to understand requirements and translate them into technical solutions.
o Ensure compliance with data governance, security, and privacy standards.
Required Skills & Qualifications
Technical Expertise
o Strong proficiency in Snowflake architecture and performance tuning.
o Hands-on experience with DBT, Airbyte, and Airflow.
o Solid understanding of Kimball methodology for data warehousing.
o Advanced SQL skills and familiarity with Python for ETL scripting.
o Experience integrating data from APIs and relational databases.
Soft Skills
o Excellent communication and collaboration skills.
o Ability to work in an agile environment and manage multiple priorities.
Preferred Qualifications