Experience : 6-10 years
Must-have technical skills : Python, AWS, Kafka, Terraform, SQL, Git
Job Description :
We are seeking a highly skilled Backend Developer to join the Cloud Data Hub (CDH) Team in India. The ideal candidate is a backend development expert with proficiency in Python, AWS, Kafka, Terraform, and Git, and a passion for building scalable and efficient systems. This role involves designing, developing, and maintaining backend solutions for the CDH platform, contributing to the Client's transformation into a fully data-driven organization.
Role & Responsibilities :
- Design, develop, and maintain backend systems for the CDH platform, ensuring robust, scalable, and efficient solutions.
- Build and enhance serverless architectures and REST APIs using Python and AWS services.
- Implement and manage Kafka data streaming pipelines for real-time data processing and metadata orchestration.
- Develop and deploy infrastructure using Terraform for infrastructure-as-code automation on AWS.
- Utilize Git for version control and collaborate with the team on code reviews and CI/CD pipelines.
- Apply Test-Driven Development (TDD) principles to ensure code reliability, maintainability, and high-quality deliverables.
- Ensure the backend systems comply with the Client's security standards, performance metrics, and scalability requirements.
- Proactively identify, debug, and resolve performance bottlenecks and system issues.
- Contribute to technical documentation and knowledge sharing to ensure project continuity and team alignment.
Skills & Requirements :
- Expert-level proficiency in backend development with Python.
- Strong experience with AWS cloud services, including Lambda, S3, DynamoDB, API Gateway, and other serverless offerings.
- Hands-on expertise with Kafka for building and managing data streaming solutions.
- Advanced skills in Terraform for infrastructure automation and management.
- Proficient in writing optimized SQL queries and working with relational databases.
- In-depth knowledge of Git for version control and experience with CI/CD pipelines.
- Experience in building distributed systems and handling large-scale, real-time data processing workloads.
- Strong understanding of system design, scalability, and security best practices.
- Excellent debugging and problem-solving skills, with a detail-oriented mindset.
- Good communication and interpersonal skills to collaborate effectively with cross-functional teams.
- Experience working with Docker and containerized environments.
- Familiarity with agile frameworks and participation in Scrum ceremonies.
- Knowledge of monitoring and observability tools like CloudWatch, Prometheus, or Grafana.
- AWS Certified Solutions Architect or other AWS certifications are a plus.
(ref : hirist.tech)