X.Arterian - Senior Data Engineer - Data Build Tool

X.Arterian · Delhi, IN
30+ days ago
Job type
  • Remote
Job description

Shift time: 4:30 PM - 1:30 AM IST

Experience Required: 7+ years in Data Engineering, with strong expertise in DBT

About the Role:

We are seeking a Senior Data Engineer with deep experience in DBT (Data Build Tool) to join our data team. You will be responsible for building scalable and maintainable data pipelines, transforming raw data into actionable insights, and helping shape the future of our data architecture and governance practices.

Key Responsibilities:

  • Design, develop, and maintain data pipelines using DBT, SQL, and orchestration tools like Airflow or Prefect
  • Collaborate with data analysts, scientists, and stakeholders to understand data needs and deliver clean, well-modeled datasets
  • Optimize DBT models for performance and maintainability
  • Implement data quality checks, version control, and documentation standards in DBT
  • Work with cloud data warehouses like Snowflake, BigQuery, Redshift, or Databricks
  • Own and drive best practices around data modeling (Kimball, Star/Snowflake schemas), transformation layers, and CI/CD for data
  • Collaborate with cross-functional teams to integrate data from various sources (APIs, third-party tools, internal services)
  • Monitor and troubleshoot data pipelines and ensure timely delivery of data to business stakeholders
  • Mentor junior engineers and contribute to team growth and development

Required Skills:

  • 7+ years of experience in Data Engineering or related fields
  • 4+ years of hands-on experience with DBT (Core or Cloud)
  • Strong SQL skills and experience with modular data modeling
  • Experience with ELT/ETL pipelines using orchestration tools like Airflow, Dagster, Prefect, or similar
  • Solid understanding of data warehouse architecture and performance tuning
  • Proficient with one or more cloud platforms: AWS, GCP, or Azure
  • Familiarity with version control (Git), CI/CD pipelines, and testing frameworks in data engineering
  • Experience working with structured and semi-structured (JSON, Parquet) data
  • Excellent communication and documentation skills

Preferred Qualifications:

  • Experience with DataOps practices and monitoring tools
  • Familiarity with Python or Scala for data processing
  • Exposure to Looker, Tableau, or other BI tools
  • Knowledge of data governance, cataloguing, or lineage tools (e.g., Great Expectations, Monte Carlo, Atlan)

(ref: hirist.tech)
