Collinson Group - DataOps Engineer

Collinson Loyalty & Benefits Private Limited, Mumbai
26 days ago
Job description

We are seeking a skilled DataOps Engineer to join our growing data and analytics team. The ideal candidate will be responsible for building, maintaining, and optimizing data pipelines, ensuring data quality, and enabling seamless data flow across the organization.

This role bridges the gap between data engineering, operations, and analytics, contributing to the reliability, scalability, and efficiency of our data platforms.

As a DataOps Engineer, you will collaborate closely with data engineers, data scientists, and business stakeholders to deploy, monitor, and optimize data workflows, while implementing best practices for automation, testing, and governance.

Key Responsibilities:

  • Design, implement, and maintain robust data pipelines and workflows to support analytics and machine learning initiatives.
  • Automate data ingestion, transformation, and integration processes across multiple sources and platforms.
  • Monitor, troubleshoot, and optimize data pipelines to ensure reliability, scalability, and performance.
  • Implement data validation, quality checks, and error handling procedures to ensure high-quality datasets.
  • Collaborate with data engineers, data scientists, and business teams to understand requirements and deliver actionable data solutions.
  • Manage deployments, versioning, and configuration of data pipelines and related infrastructure.
  • Contribute to CI/CD pipelines and infrastructure-as-code practices for data systems.
  • Establish and maintain monitoring and alerting mechanisms for data workflow health.
  • Participate in data governance initiatives, ensuring compliance with company policies and regulations.
  • Evaluate and adopt emerging tools, technologies, and best practices in data operations.

Required Skills and Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
  • 6-9 years of experience in data engineering, data operations, or DevOps for data platforms.
  • Strong programming skills in Python, SQL, or Java/Scala for data processing.
  • Hands-on experience with ETL/ELT frameworks and workflow orchestration tools (Airflow, Luigi, Prefect, etc.).
  • Experience with cloud data platforms such as AWS (Glue, Redshift), Azure (Data Factory, Synapse), or GCP (BigQuery, Dataflow).
  • Proficiency with database systems including relational (MySQL, PostgreSQL) and NoSQL (MongoDB, Cassandra) databases.
  • Familiarity with containerization and orchestration tools (Docker, Kubernetes) is a plus.
  • Knowledge of data modeling, warehousing, and analytics concepts.
  • Strong problem-solving skills, attention to detail, and ability to work in a collaborative, fast-paced environment.
  • Experience with monitoring, logging, and alerting frameworks for data workflows.

Preferred Skills (Optional):

  • Experience with big data processing frameworks such as Spark, Hadoop, or Flink.
  • Familiarity with data governance, lineage, and metadata management tools.
  • Knowledge of DevOps practices applied to data pipelines and infrastructure.
  • Exposure to machine learning pipelines and deployment workflows.

(ref: hirist.tech)
