Role Overview
This role focuses on delivering robust, data-driven solutions using modern agile technical practices — including Continuous Integration & Deployment (CI / CD), Test-Driven Development (TDD), and Extreme Programming (XP). You’ll help design and implement scalable data systems while mentoring others to work more effectively and efficiently.
Key Responsibilities
- Collaborate with clients and internal teams to define data processing, storage, and access requirements.
- Design, build, and maintain highly available data pipelines and storage solutions such as data warehouses.
- Deliver high-quality software using agile practices like pair programming, TDD, and CI / CD.
- Automate data infrastructure setup, configuration, and deployments.
- Advocate agile methodologies and best practices within client organizations.
- Mentor and guide client team members to enhance data engineering capabilities.
- Continuously improve team performance and data delivery efficiency.
Required Skills & Experience
- Extensive experience developing and managing data pipelines, platforms, and large-scale data projects.
- Proficiency in at least one major cloud platform (AWS, GCP, or Azure).
- Strong programming background, preferably in Python.
- Hands-on experience with various databases, data warehouses, and distributed file systems in production environments.
- Advanced skills in SQL and data modeling.
- Proven ability to process and manage data at scale.
- Understanding of GDPR compliance and data security best practices.
- Experience with version control and Infrastructure-as-Code tools (e.g., Git / GitHub, Terraform, Ansible) and CI / CD pipelines.
- Ability to collaborate effectively with data scientists and BI teams, providing data preparation and mentoring support.
- Excellent communication and collaboration skills, including working effectively with globally distributed teams.
Write to shruthi.s@careerxperts.com