Senior Python Data Engineer – ETL & Pipeline Development
Experience: 5 to 12 years
Location: Remote (WFH) / Bangalore, India
Mode of Engagement: Full-time
No. of Positions: 5
Educational Qualification: B.E. / B.Tech. / M.Tech. / MCA in Computer Science / IT
Industry: IT / Data / AI / LegalTech / Enterprise Solutions
Notice Period: Immediate joiners preferred
What We Are Looking For:
- 5–12 years of hands-on experience building end-to-end ETL pipelines and data workflows from scratch (Python + SQL).
- Strong expertise in Python for data transformation, automation, and orchestration (Pandas, Airflow, custom schedulers); a minimal Airflow sketch follows this list.
- Ability to write complex SQL queries, optimize performance, and design scalable database schemas.
- Experience integrating and consuming RESTful APIs for ingestion, transformation, and delivery pipelines.
- Working knowledge of AWS / GCP / Azure (S3, Lambda, EC2, BigQuery, etc.).
- Hands-on exposure to Docker, Kubernetes, and CI/CD for deploying ETL and data workloads.
- Experience building or working with AI-driven / automated data pipelines (preferred).
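To make the orchestration expectation concrete, here is a minimal sketch of a daily extract-transform-load DAG using the Airflow 2.x TaskFlow API and Pandas. The DAG name, fields, and inline sample rows are illustrative assumptions, not a description of any actual pipeline.

```python
# Hypothetical daily ETL DAG; names and sample rows are placeholders.
# Assumes Airflow 2.4+ (the "schedule" parameter) and Pandas installed.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract() -> list[dict]:
        # A real pipeline would pull from a source API or database here.
        return [{"order_id": 1, "amount": "19.99"}, {"order_id": 2, "amount": "5.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        df = pd.DataFrame(rows)
        df["amount"] = df["amount"].astype(float)  # normalize types before loading
        return df.to_dict(orient="records")

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to the warehouse via a provider hook.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


orders_etl()  # instantiating the decorated function registers the DAG
```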
Responsibilities:
- Design, architect, and build complete ETL pipelines for multi-system data extraction, cleaning, transformation, and loading.
- Develop custom Python-based data automation frameworks to improve quality and speed.
- Build and optimize SQL queries, tables, and database structures for analytics and AI use cases.
- Create and manage API-based data ingestion and integration workflows.
- Deploy and maintain workloads using Docker, Kubernetes, and CI/CD tools.
- Collaborate with product, AI, and analytics teams to deliver intelligent data solutions.
- Set up data validation, monitoring, logging, and alerting mechanisms for production pipelines; a validation sketch follows this list.
- Continuously optimize pipelines for performance, cost, scalability, and reliability.
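As an illustration of the validation and logging work described above, here is a minimal sketch of a row-level quality gate with structured logging. The field names and checks are assumptions for the example only.

```python
# Minimal row-level validation step with logging; fields are hypothetical.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline.validate")


def validate_rows(rows: list[dict]) -> list[dict]:
    """Drop rows that fail basic checks and log what was rejected."""
    valid = []
    for row in rows:
        if row.get("order_id") is None:
            log.warning("rejected row with missing order_id: %r", row)
            continue
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            log.warning("rejected row with bad amount: %r", row)
            continue
        valid.append(row)
    log.info("validation kept %d of %d rows", len(valid), len(rows))
    return valid


if __name__ == "__main__":
    sample = [{"order_id": 1, "amount": 19.99}, {"order_id": None, "amount": 5.0}]
    print(validate_rows(sample))  # keeps the first row, rejects the second
```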
Qualifications:
- Bachelor’s or Master’s degree in Computer Science / Engineering / IT.
- Expertise in Python (FastAPI / Flask / Django), SQL, and REST API development; a small ingestion sketch follows this list.
- Strong understanding of data modeling, ETL concepts, and pipeline orchestration (Airflow / Prefect / Dagster).
- Experience with AWS (S3, Lambda, EC2, Glue, Athena) or GCP / Azure equivalents.
- Hands-on experience with Docker, Kubernetes, and Git-based CI/CD workflows.
- Strong debugging and problem-solving skills, analytical capability, and an ownership mindset.
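To illustrate the REST ingestion side of the stack, here is a minimal sketch of paginated API consumption into a Pandas DataFrame using `requests`. The endpoint URL and the `results` / `next` response fields are hypothetical assumptions about the API's shape.

```python
# Paginated REST ingestion sketch; the endpoint and response schema
# ("results" list, "next" URL) are assumptions for illustration only.
import pandas as pd
import requests


def fetch_all(url: str, timeout: int = 30) -> pd.DataFrame:
    """Follow 'next' links until the (assumed) API reports no more pages."""
    records = []
    while url:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()  # fail fast on HTTP errors
        payload = resp.json()
        records.extend(payload["results"])
        url = payload.get("next")  # None ends the loop
    return pd.DataFrame(records)


# Usage (hypothetical endpoint):
# df = fetch_all("https://api.example.com/v1/orders")
```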