Key Responsibilities:
- Design, develop, and maintain data transformation pipelines using dbt and IICS on Snowflake (a minimal dbt model sketch follows this list).
- Write optimized SQL and Python scripts for complex data modeling and processing tasks.
- Collaborate with data analysts, engineers, and business teams to implement scalable ELT workflows.
- Create and manage data models, schemas, and documentation in dbt.
- Optimize Snowflake performance using best practices such as clustering, caching, and virtual warehouse sizing (see the tuning sketch after this list).
- Manage data integration from data lakes, external systems, and cloud sources.
- Ensure data quality, lineage, version control, and compliance across all environments.
- Participate in code reviews, testing, and deployment activities using CI/CD pipelines.
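
The dbt pipeline work described above is easiest to picture with a concrete model. Below is a minimal sketch of an incremental dbt model on Snowflake; the model name (fct_orders), the upstream staging model (stg_orders), and the columns are hypothetical placeholders, not a prescribed design.

    -- models/marts/fct_orders.sql  (hypothetical file and model name)
    -- Incremental dbt model materialized as a clustered Snowflake table.
    {{ config(
        materialized = 'incremental',
        unique_key   = 'order_id',
        cluster_by   = ['order_date']   -- Snowflake clustering key for pruning
    ) }}

    select
        order_id,
        customer_id,
        order_date,
        amount
    from {{ ref('stg_orders') }}        -- assumed upstream staging model
    {% if is_incremental() %}
    -- on incremental runs, only pick up rows newer than what is already loaded
    where order_date > (select max(order_date) from {{ this }})
    {% endif %}

The unique_key lets dbt merge updated rows on each run instead of appending duplicates, which is the usual reason to prefer incremental materialization for large fact tables.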
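Likewise, the Snowflake optimization bullet typically involves warehouse and clustering DDL along these lines; the warehouse name, size, and table are hypothetical, for illustration only.

    -- Hypothetical warehouse sized for transformation workloads;
    -- AUTO_SUSPEND stops credit consumption when the warehouse is idle.
    CREATE WAREHOUSE IF NOT EXISTS transform_wh
        WAREHOUSE_SIZE = 'MEDIUM'
        AUTO_SUSPEND   = 60
        AUTO_RESUME    = TRUE;

    -- Define a clustering key on a large table so Snowflake can prune
    -- micro-partitions on date-filtered queries (hypothetical table).
    ALTER TABLE analytics.fct_orders CLUSTER BY (order_date);

    -- Inspect how well the table is clustered on that key.
    SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.fct_orders', '(order_date)');
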
Required Skills:
- 4-6 years of experience in Data Engineering or Data Platform Development.
- Hands-on experience with Snowflake data warehousing, architecture, and performance tuning.
- Proficiency in dbt (Data Build Tool): model creation, Jinja templates, macros, testing, and documentation (a macro-and-test sketch appears at the end of this posting).
- Hands-on experience creating mappings and workflows in IICS, with extensive performance tuning and troubleshooting experience.
- Strong Python scripting skills for data transformation and automation.
- Advanced SQL skills: writing, debugging, and tuning queries.
- Experience with Data Lake and Data Warehouse concepts and implementations.
- Familiarity with Git-based workflows and version control in dbt projects.

Skills Required:
SQL, Python, Git, Debugging, Snowflake
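
To make the dbt macro and testing expectations concrete, here is a minimal sketch of a reusable Jinja macro plus a singular SQL test; the macro name, the model (fct_orders), and the column names are hypothetical.

    -- macros/cents_to_dollars.sql  (hypothetical macro)
    -- Reusable Jinja macro: converts an integer cents column to dollars.
    {% macro cents_to_dollars(column_name, precision=2) %}
        round({{ column_name }} / 100.0, {{ precision }})
    {% endmacro %}

    -- tests/assert_no_negative_amounts.sql  (hypothetical singular test)
    -- dbt singular test: the test fails if this query returns any rows.
    select order_id, amount
    from {{ ref('fct_orders') }}
    where amount < 0

In a model, the macro would be invoked as {{ cents_to_dollars('amount_cents') }}, and running "dbt test" executes the singular test alongside any schema tests defined in YAML.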