Industry & Sector : Cloud Data Engineering and Enterprise Analytics — building scalable data warehouses, ELT pipelines and analytics platforms for enterprise customers across finance, retail, and technology. The team delivers production-grade data products that power BI, reporting, and advanced analytics.
Primary title : Snowflake Data Engineer (Snowflake Developer). Location : India — On-site.
Role & Responsibilities
- Design and implement scalable ELT pipelines on Snowflake to ingest, transform, and curate enterprise data for analytics and reporting.
- Develop robust SQL- and Python-based routines for data transformation, orchestration, and automation using dbt and Airflow.
- Implement Snowflake features (Snowpipe, Streams, Tasks, Time Travel, Cloning) to enable real-time / near-real-time ingestion and CDC patterns.
- Optimize Snowflake performance and cost through clustering keys, micro-partition pruning, query profiling, and resource monitor strategies.
- Collaborate with data engineering, analytics, and product teams to translate business requirements into data models, schemas, and pipeline SLAs.
- Establish CI / CD, version control and observability for data pipelines including testing, deployment pipelines and monitoring alerts.
Skills & Qualifications
Must-Have
- Snowflake
- SQL
- Python
- dbt
- Apache Airflow
- Snowpipe
Preferred
- AWS
- Git
- Terraform
Qualifications
- Proven experience building production ELT / ETL pipelines using Snowflake in enterprise environments.
- Strong understanding of data modeling (star / snowflake schemas), CDC patterns, and data governance best practices.
- Experience implementing monitoring, alerting, and cost controls for cloud data platforms.
Benefits & Culture Highlights
- Opportunity to work on enterprise-scale data platforms and modern cloud data tooling.
- Collaborative, engineering-driven culture with emphasis on automation, code quality, and observability.
- Fast-paced projects that impact analytics, reporting, and data-driven product decisions across customers.
To apply, please highlight your Snowflake implementations, performance optimizations, and any dbt or Airflow projects in your CV. This role is best suited to hands-on engineers who enjoy owning end-to-end data pipelines and driving measurable improvements in platform performance and cost.