Job Summary:
We are seeking an experienced and skilled Senior Data Architect to join our team. In this role, you will be responsible for designing and implementing scalable data pipelines and workflows.
Key Responsibilities:
- Architect, build, and optimize scalable data pipelines and workflows
- Take end-to-end ownership of AWS resources: configuration, optimization, and debugging
- Work closely with product and engineering teams to enable high-velocity business impact
- Automate and scale data processes—manual workflows are not part of the culture
- Build foundational data systems that drive critical business decisions
Requirements:
- At least 3 years of professional experience in data architecture
- Demonstrated end-to-end ownership of ETL pipelines
- Deep, hands-on experience with AWS services: EC2, Athena, Lambda, and Step Functions
- Strong proficiency in MySQL
- Working knowledge of Docker: setup, deployment, and troubleshooting
Preferred Skills:
- Experience with orchestration tools such as Airflow or similar
- Hands-on experience with PySpark
- Familiarity with the Python data ecosystem: SQLAlchemy, DuckDB, PyArrow, Pandas, NumPy
- Exposure to DLT (Data Load Tool)
Ideal Candidate Profile:
The ideal candidate is a self-motivated, independent individual with excellent communication skills. They should have a builder's mindset rather than a maintainer's, and thrive in fast-paced startup environments.
Compensation Range:
₹8.4–12 LPA (fixed base), excluding equity, performance bonus, and revenue share components.