Company : AA GLOBUSDIGITAL INDIA PRIVATE LIMITED
AA GLOBUSDIGITAL INDIA PRIVATE LIMITED is a wholly owned subsidiary of Globus Systems Inc., US.
Globus Systems was founded by industry executives who have been part of the IT services industry for the past 20 years and have seen it evolve and mature. We understand the challenges faced by organizations as they prepare for the future. As a technology delivery company, we are focused on helping organizations lay a foundation for their "Tomorrow-Roadmap".
At the heart of any business is the data that drives decisions. Data integrity and security are key drivers for growth. Smart and timely use of technology can help build, streamline and enable data-driven decisions that become the backbone of an organization. Business leaders are constantly searching for new solutions, services and partners they can trust to enable these drivers.
- Role : Data Engineer- ETL Development
- Location : PAN India
- Notice Period : Immediate joiners required
- Experience : 5 to 7 years
- Work Mode : WFH
- CTC : Market Standard
About the Role
We are looking for a highly skilled Data Engineer with strong SQL and Python expertise,
hands-on experience in building ETL pipelines, and exposure to modern AI automation
frameworks. The ideal candidate will be comfortable working in Lakehouse architectures
and integrating GenAI, RAG, and LangChain-based solutions into data workflows.
Key Responsibilities
Data Engineering & ETL
- Design, develop, and maintain robust ETL / ELT pipelines for various data sources.
- Work with Lakehouse architecture to manage structured and unstructured data.
- Ensure data quality, reliability, scalability, and performance across pipelines.
- Build and maintain transformations using Data Build Tool (dbt).
AI Automation & GenAI Stack
- Build and integrate GenAI-driven automation solutions using frameworks like LangChain and Retrieval-Augmented Generation (RAG).
- Implement workflows involving model-driven reasoning, orchestration, and tool usage (e.g., MCP).
- Develop modular components that leverage LLMs to enhance data processing, analytics, and automation.
Programming & APIs
- Write clean, efficient, and optimized Python code for data transformation, processing, and automation tasks.
- Work with REST APIs to extract, ingest, or integrate external datasets and services.
- Collaborate with cross-functional teams to expose internal data services through APIs.
Database & SQL
- Write and optimize complex SQL queries using PostgreSQL.
- Design and maintain relational schemas, indexes, and stored procedures for performance efficiency.
Required Skills & Qualifications
- 4–5 years of experience in Data Engineering or similar roles.
- Strong expertise in SQL (PostgreSQL preferred) and Python.
- Hands-on experience with ETL / ELT development and Lakehouse architecture.
- Exposure to modern AI / LLM frameworks: LangChain, RAG pipelines, GenAI automation, MCP.
- Experience working with REST APIs.
- Good understanding of software engineering practices, version control (Git), and CI / CD.
Nice-to-Have Skills
- Experience with cloud platforms (AWS, Azure, GCP).
- Familiarity with containerization (Docker).
- Experience orchestrating workflows with Airflow, Prefect, or similar.
- Basic understanding of vector databases and embeddings.
Soft Skills
- Strong problem-solving and analytical thinking.
- Ability to communicate complex topics clearly.
- Independent, proactive, and curious about modern AI tooling.
- Team-oriented mindset with a drive for continuous improvement.