Note: Notice period - max 15 to 30 days.

Minimum 5+ years of development and design experience in Java / Scala / Python with Flink, Beam (or Spark Streaming using real-time data, not batch data) and Kafka.
- Containerization and orchestration (Docker and OpenShift / Kubernetes)
- Kafka, Flink, Beam / Kafka streams, Apache Spark or similar streaming technologies
- Source control (GIT), automated build / deployment pipelines (Jenkins, ArgoCD, Kaniko, Shipwright etc.)
- Public cloud, preferably Azure and OCI
- Linux OS configuration and shell scripting
- Working with streaming data sets at scale
- Public Cloud automation toolset
- Cloud Native applications
- Understanding of GitOps
- Extensive coding experience and knowledge in event-driven and streaming architecture
- Good hands-on experience with design patterns and their implementation
- Experience doing automated unit and integration testing
- Well versed in CI/CD principles (GitHub, Jenkins, etc.), and actively involved in troubleshooting issues in a distributed services ecosystem
- Familiar with distributed-services resiliency and monitoring in a production environment
- Responsible for adhering to established policies, following best practices, developing an in-depth understanding of exploits and vulnerabilities, and resolving issues by taking appropriate corrective action
- High-level knowledge of compliance and regulatory requirements for data, including but not limited to encryption, anonymization, data integrity, and policy-control features in large-scale infrastructures
- Understand data sensitivity in terms of logging, events, and in-memory data storage, such as no card numbers or personally identifiable data in logs
- Distributed systems
- Network fundamentals and host-level routing
- Tuning distributed systems
- Automated testing
- Security engineering practices and tools
- Event-driven architecture