Data Engineer

Sonata Software • India

Job Description Summary

We are seeking a highly skilled Data Engineer to join our growing team. The ideal candidate has strong experience building and maintaining robust, scalable, cloud-native data pipelines and data warehouses using tools such as Snowflake, Fivetran, Airflow, and dbt. You will work closely with data analysts, data scientists, and engineering teams to ensure reliable, timely, and secure data delivery.

Key Responsibilities

  • Design, develop, and maintain batch and streaming data pipelines to load data marts.
  • Implement scalable data transformations using Snowflake stored procedures and orchestrate workflows via Airflow or equivalent tools (a minimal sketch follows this list).
  • Integrate with data platforms such as Snowflake, ensuring efficient data storage and retrieval.
  • Write optimized SQL and Python scripts for data manipulation and ETL processes.
  • Maintain data quality, observability, and pipeline reliability through monitoring and alerting.
  • Collaborate with analytics and business teams to deliver high-impact data solutions.
  • Adhere to best practices for version control, documentation, and CI/CD in a collaborative environment.
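
For illustration, the following is a minimal sketch of the kind of pipeline described above, assuming Airflow 2.x with the apache-airflow-providers-common-sql package installed and a Snowflake connection named "snowflake_default"; the DAG id, schedule, and stored procedure name are hypothetical.

# Minimal sketch: a daily Airflow DAG that calls a Snowflake stored
# procedure to load a data mart. Assumes Airflow 2.x, the common-SQL
# provider, and a "snowflake_default" connection; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    dag_id="load_sales_datamart",    # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # one batch load per day
    catchup=False,
):
    # Invoke the transformation inside Snowflake; the warehouse does the
    # heavy lifting, Airflow only orchestrates, monitors, and retries.
    SQLExecuteQueryOperator(
        task_id="call_transform_proc",
        conn_id="snowflake_default",
        sql="CALL analytics.load_sales_datamart();",  # hypothetical procedure
        retries=2,  # basic reliability: retry transient failures
    )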

Qualifications

  • Bachelor’s degree in Information Technology or a related field, or an equivalent combination of education and experience sufficient to successfully perform the key accountabilities of the role.
  • Experience with data ingestion and orchestration tools such as Fivetran and Airflow, as well as Python.
  • Exposure to and a good understanding of Microsoft Dynamics 365 (D365) ERP data.
  • Prior experience working in fast-paced product or analytics teams.

Experience

  • 5+ years of hands-on experience in data engineering.
  • Strong experience with:
    • Snowflake or similar cloud data warehouses.
    • Airflow or other orchestration tools.
    • SQL and Python.
  • Strong hands-on experience building transformation pipelines using Python, Airflow, and Snowflake stored procedures.
  • Hands-on experience with AWS, Azure, or GCP services.
  • Good understanding of data architecture, security, and performance tuning.
  • Familiarity with version control (e.g., Git), CI/CD tools, and agile workflows.