Job description
Location: Bangalore
Experience: 5-8 years
Overview:
We are seeking a highly skilled Data Engineer with strong hands-on experience in Snowflake and dbt (Data Build Tool) to join our data engineering team. The ideal candidate will be responsible for designing and developing scalable data pipelines, performing advanced data transformations, and ensuring data quality using modern data stack technologies.
Key Responsibilities:
Design, develop, and optimize data pipelines using dbt and Snowflake.
Build efficient, reliable, and scalable data transformation models with dbt Core or dbt Cloud.
Implement Snowflake features such as Snowpipe, Streams, Tasks, and Dynamic Tables.
Work closely with Data Analysts, Analytics Engineers, and Business teams to understand data requirements.
Ensure data quality and perform rigorous data testing and validation using dbt tests.
Maintain and enhance the data warehouse architecture to support business intelligence and reporting needs.
Monitor data pipeline performance and troubleshoot issues proactively.
Apply version control practices (Git) and CI/CD for data workflows.
Strong proficiency in Python: the candidate should be comfortable writing production-grade Python code, interacting with APIs to extract and integrate data from various sources, and automating workflows.
Experience handling large-scale data ingestion, transformation, and processing tasks while ensuring data quality, reliability, and scalability across platforms.
Required Skills & Qualifications:
5+ years of experience in Data Engineering.
Strong hands-on experience with Snowflake, including data modeling, performance tuning, and administration.
Advanced proficiency in dbt (Core or Cloud) for data transformations and testing.
Proficient in SQL (complex queries, CTEs, window functions, optimization).
Experience with ETL/ELT design patterns and tools such as Apache NiFi, Airflow, and Fivetran.
Solid understanding of data warehousing concepts, dimensional modeling, and medallion architecture.
Experience with AWS is a must; experience with other cloud providers such as Azure or GCP is a plus.
Familiarity with Git/GitHub and version-controlled deployment pipelines.
Excellent communication skills and ability to work in cross-functional teams.
Demonstrated ability to thrive in fast-paced environments. The candidate should be comfortable diving deep into datasets, identifying patterns, and uncovering data quality issues in environments where data sanity is low.