ETL Developer

Nutanix | Bengaluru, Karnataka, India
Job description

The Opportunity

We are looking for a highly skilled Data Integration Engineer to design, build, and manage scalable data pipelines and integration solutions across cloud and on-premises platforms. The role requires strong expertise in ETL/iPaaS tools, APIs, and data platforms, with exposure to AI/ML-driven automation for smarter monitoring, anomaly detection, and data quality improvement.

About the Team

At Nutanix, the Data Science team is a dynamic and diverse group of 50 talented professionals spread across our offices in India (Bangalore and Pune) and San Jose. We pride ourselves on fostering a collaborative and supportive environment where innovation thrives. Our team is deeply committed to leveraging data in a results-oriented manner, ensuring our solutions remain customer-centric. We believe in transparency and trust, which allows for open communication and a fluid exchange of ideas. Being agile and adaptable, we embrace diverse perspectives to drive creativity and efficient problem-solving.

Your Role

  • Design, develop, and optimize data integration workflows, ETL/ELT pipelines, and APIs.
  • Work with iPaaS and ETL tools (Informatica) to integrate enterprise systems.
  • Build pipelines across cloud platforms (AWS, Azure, GCP) and modern warehouses (Snowflake, Databricks, BigQuery, Redshift).
  • Implement data quality, lineage, and governance frameworks to ensure reliable data flow.
  • Leverage AI/ML models for data anomaly detection, pipeline monitoring, and predictive quality checks.
  • Contribute to self-healing pipeline design by incorporating AI-driven automation.
  • Collaborate with architects, analysts, and business teams to integrate structured, semi-structured, and unstructured data sources.
  • Document integration patterns, best practices, and reusable frameworks.

What You Will Bring

  • 6–8 years of experience in data integration, ETL/ELT design, and data pipelines.
  • Strong expertise in Informatica or similar ETL/iPaaS tools.
  • Proficiency in SQL, Python, and automation scripting.
  • Experience with cloud data platforms (Snowflake, Databricks, BigQuery).
  • Familiarity with data governance practices (cataloging, lineage, data quality frameworks).
  • Exposure to AI/ML concepts applied to data quality and pipeline optimization.
  • Understanding of DevOps/CI-CD pipelines for data integration deployments.

Nice to Have

  • Hands-on experience with Kafka, Spark, Airflow, or event-driven architectures.
  • Knowledge of REST APIs, microservices, and real-time data integration.
  • Conceptual understanding of, or hands-on exposure to, ML frameworks (Scikit-learn, TensorFlow, PyTorch).
  • Experience contributing to AI-augmented, self-healing pipelines.

Education

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.

Work Arrangement

Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of three days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.
