Job Opportunity: Snowflake + DBT Engineer
Location: Pune / Remote
Experience: 8+ years
Job Description
We are seeking a highly skilled Snowflake + DBT Engineer to design, build, and optimize scalable cloud-based data platforms. The ideal candidate will have strong expertise in Snowflake architecture, ELT processes, and DBT-based data transformation frameworks. This role requires hands-on technical proficiency, strategic thinking, and the ability to collaborate with cross-functional teams to deliver high-quality data solutions.
Key Responsibilities
- Architect, develop, and optimize end-to-end Snowflake data warehouse solutions.
- Design, implement, and maintain DBT models, transformations, and reusable data frameworks.
- Build high-performance SQL queries, schemas, and pipelines with a focus on optimization and scalability.
- Partner with data engineering teams to automate and version-control workflows using DBT and modern DevOps practices.
- Ensure data quality, documentation, and lineage tracking across all layers of the platform.
- Collaborate with analytics, product, and engineering teams to enforce data governance, security, and best practices.
Required Skills & Qualifications
- 8+ years of experience in data engineering with deep hands-on expertise in Snowflake.
- Strong proficiency in DBT: data modeling, macros, tests, documentation, and workflow orchestration.
- Expert-level SQL skills with a strong understanding of ELT/ETL design principles.
- Experience working with cloud ecosystems such as AWS, GCP, or Azure.
- Strong understanding of data architecture, security, governance, and performance tuning.
- Ability to work independently, solve complex technical problems, and deliver high-quality solutions.