Data Pipeline Engineer

Airtel Payments Bank, Haryana, India
1 day ago
Job description

About Us :

Airtel Payments Bank, India's first payments bank, is a completely digital and paperless bank. The bank aims to take basic banking services to the doorstep of every Indian quickly and efficiently by leveraging Airtel's vast retail network.

At Airtel Payments Bank, we're transforming the way banking operates in the country. Our core business is banking, and we've set out to serve every unbanked and underserved Indian. Our products and technology aim to take basic banking services to the doorstep of every Indian. We are a fun-loving, energetic and fast-growing company that breathes innovation. We encourage our people to push boundaries and evolve from the skilled professionals of today into the risk-taking entrepreneurs of tomorrow. We hire people from every realm and offer them opportunities that encourage individual and professional growth. We are always looking for people who are thinkers & doers; people with passion, curiosity & conviction; people who are eager to break away from conventional roles and do 'jobs never done before'.

About the Team :

We are a team of engineers and problem-solvers building scalable, modern data infrastructure. Our mission is to power intelligent decision-making through clean, reliable, and real-time data pipelines using technologies like Kafka, PySpark, Hadoop, Airflow, and AWS. If you love working with data at scale, building cloud-native solutions, and improving pipeline reliability, you'll thrive here.

Key Responsibilities :

  • Design, build, and maintain scalable ETL / ELT pipelines using Python, PySpark, and SQL (a minimal sketch follows this list).
  • Develop real-time data ingestion and streaming solutions using Apache Kafka and AWS Kinesis.
  • Leverage the Hadoop ecosystem and AWS EMR for distributed data processing.
  • Automate and orchestrate workflows using Apache Airflow (deployed via MWAA on AWS).
  • Build and expose data services and APIs using Flask, deployed via Nginx.
  • Implement centralized logging and monitoring with Filebeat and the ELK / OpenSearch stack.
  • Work extensively with AWS services including S3, API Gateway, OpenSearch, Kinesis, and EMR.
  • Collaborate with data scientists, analysts, and platform engineers to ensure high-quality, accessible data.
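
As referenced in the first bullet above, here is a minimal PySpark sketch of the kind of batch ETL step this role involves: read raw events from S3, apply a simple quality filter, and write partitioned Parquet. The bucket paths, column names, and app name are illustrative assumptions, not details of Airtel Payments Bank's actual platform.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical locations; a production job would take these from config or Airflow.
    SOURCE = "s3://example-bucket/raw/transactions/"
    TARGET = "s3://example-bucket/curated/transactions/"

    spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

    events = (
        spark.read.json(SOURCE)                          # raw JSON events
        .filter(F.col("amount").isNotNull())             # basic data-quality gate
        .withColumn("event_date", F.to_date("event_ts"))
    )

    (
        events.write.mode("overwrite")
        .partitionBy("event_date")                       # partition for efficient downstream reads
        .parquet(TARGET)
    )

On AWS EMR a script like this would typically be submitted as a Spark step or triggered from an Airflow DAG running on MWAA.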

Must-Have Skills :

  • 4–7 years of experience in data engineering.
  • Proficiency in Python and SQL (experience with Oracle or a similar RDBMS).
  • Strong experience with Apache Kafka (producer / consumer architecture, stream processing concepts); a short producer / consumer sketch follows this list.
  • Hands-on with PySpark, Hadoop & Hive for big data processing.
  • Workflow orchestration using Apache Airflow (and optionally MWAA).
  • Building APIs with Flask and serving via Nginx.
  • Logging and observability using ELK Stack and Filebeat.
  • Good familiarity with the AWS ecosystem : EMR, Kinesis, S3, OpenSearch, API Gateway, MWAA.
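
As a quick illustration of the producer / consumer pattern called out above, here is a minimal sketch using the kafka-python client; the broker address, topic name, and payload are assumptions for the example, not details of the actual platform.

    import json
    from kafka import KafkaProducer, KafkaConsumer

    # Publish a JSON event to a hypothetical "transactions" topic.
    producer = KafkaProducer(
        bootstrap_servers=["localhost:9092"],
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("transactions", {"account_id": "A123", "amount": 250.0})
    producer.flush()

    # Consume from the same topic as part of a named consumer group.
    consumer = KafkaConsumer(
        "transactions",
        bootstrap_servers=["localhost:9092"],
        group_id="pipeline-demo",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(message.topic, message.value)

In a real streaming pipeline the consumer loop would usually hand records to PySpark Structured Streaming or land them in S3 / OpenSearch rather than printing them.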

Good-to-Have (but not mandatory) :

  • Experience with AWS Glue, Athena, and Redshift for serverless data processing and warehousing (a short Athena sketch follows this list).
  • Familiarity with AWS Flink or other stream processing frameworks.
  • Exposure to AWS DMS (Database Migration Service) for database migrations and replication tasks.
  • Knowledge of AWS QuickSight for dashboarding and BI reporting.
  • Understanding of data lake architectures and event-driven processing on AWS.
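
For the serverless querying mentioned in the first good-to-have item, a minimal Athena sketch using boto3 might look like the following; the database, table, region, and result bucket are hypothetical placeholders.

    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")

    # Kick off an asynchronous query; results land in the given S3 location.
    response = athena.start_query_execution(
        QueryString="SELECT event_date, COUNT(*) AS events "
                    "FROM curated.transactions GROUP BY event_date",
        QueryExecutionContext={"Database": "curated"},
        ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
    )
    print(response["QueryExecutionId"])

The query runs asynchronously, so a caller would poll get_query_execution (or use an Airflow Athena operator) before reading the results.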

Why Join Us?

Airtel Payments Bank is transforming from a digital-first bank into one of the largest fintech companies. There could not be a better time than now to join us and be a part of this incredible journey. We at Airtel Payments Bank don't believe in an all-work-and-no-play philosophy. For us, innovation is a way of life, and we are a happy bunch of people who have built an ecosystem that drives financial inclusion in the country by serving the 300 million financially unbanked, underbanked, and underserved people of India. Some defining characteristics of life at Airtel Payments Bank are Responsibility, Agility, Collaboration and Entrepreneurial development; these are also reflected in our core values, which we fondly call RACE.
