Job Title : Kafka Engineer
Experience : 5 to 7 years
Job Summary :
We are seeking a skilled Kafka Engineer to design, build, and maintain real-time streaming pipelines using Apache Kafka. The role involves developing Java-based producers and consumers, ensuring scalability, fault tolerance, and schema governance, and integrating Kafka into CI/CD pipelines. You'll collaborate with DevOps teams to manage infrastructure, security, and automation while enabling enterprise-wide real-time data processing.
Key Responsibilities :
- Design and maintain high-throughput, low-latency Kafka streaming pipelines.
- Develop robust Kafka producers and consumers in Java (a minimal producer sketch follows this list).
- Implement schema governance using Confluent Schema Registry (Avro/Protobuf).
- Collaborate with DevOps for CI/CD, deployment, and infrastructure management.
- Ensure security best practices for Kafka clusters: encryption, authentication, and access control (a client security configuration sketch also follows this list).
- Monitor and troubleshoot Kafka performance, latency, and message delivery.
- Document architecture, deployment, and operational processes.
- Participate in code reviews and knowledge-sharing sessions.
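For illustration only, a minimal Java producer in the spirit of the responsibilities above might look like the sketch below. The bootstrap server, topic name ("orders"), key, and payload are placeholder assumptions rather than details from this posting; acks=all and idempotence are shown as typical durability settings for low-loss pipelines.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address and topic name below are illustrative placeholders
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Durability settings typical for low-loss, exactly-once-friendly pipelines
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
            // send() is asynchronous; the callback reports the partition/offset or the failure
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Written to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```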
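Likewise, a sketch of the client-side security settings implied by the encryption/authentication responsibility, assuming a SASL_SSL listener with SCRAM authentication. The broker address, mechanism, credentials, and truststore path are illustrative assumptions; topic-level access control (ACLs) is configured on the brokers rather than in client code.

```java
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.config.SslConfigs;

public class SecureClientConfig {
    // Builds client properties for a SASL_SSL-protected cluster.
    // Hostname, mechanism, credentials, and paths are placeholders, not real values.
    public static Properties secureProps() {
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "broker-1.example.com:9093");
        // TLS encryption in transit plus SASL authentication of the client
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"svc-orders\" password=\"changeit\";");
        // Truststore lets the client verify the broker's TLS certificate
        props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.truststore.jks");
        props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "changeit");
        return props;
    }
}
```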
Must-Have Skills :
- 5 to 7 years in backend/data engineering with Apache Kafka experience.
- Strong Java skills for Kafka integration.
- Deep understanding of Kafka internals (brokers, partitions, consumer groups, offsets, replication).
- Experience with schema management (Avro/Protobuf) and CI/CD pipelines (an Avro producer sketch follows this list).
- Ability to collaborate with DevOps on Kafka infrastructure, security, and automation.
- Solid knowledge of real-time data processing, fault tolerance, and data consistency.
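As a sketch of the schema-management requirement, the snippet below publishes an Avro GenericRecord through Confluent's KafkaAvroSerializer so the schema is registered and validated against Schema Registry. The schema, topic, and registry URL are illustrative assumptions, and the Confluent kafka-avro-serializer dependency is required on the classpath.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroOrderProducer {
    public static void main(String[] args) {
        // In practice the schema would live in a shared .avsc file under version control
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"string\"},"
                + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // The Confluent serializer registers/validates the schema against Schema Registry
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-123");
        order.put("amount", 49.95);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders-avro", "order-123", order));
            producer.flush();
        }
    }
}
```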
Good-to-Have Skills :
- Experience with Kafka Streams, ksqlDB, or Apache Flink (a topology sketch follows this list).
- Knowledge of Kafka Connect and connectors for data ingestion/export.
- Exposure to the Confluent Kafka Platform, REST Proxy, and monitoring tools (Grafana, ELK).
- Familiarity with Kubernetes/OpenShift and cloud-native Kafka (AWS MSK).
- Spring Boot experience for Kafka integration in microservices.
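For the Kafka Streams item, a minimal topology sketch is shown below. The application id, topic names, and the JSON-substring filter are purely illustrative assumptions; a real pipeline would typically use a proper Serde for the event payload rather than string matching.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-filter-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw order events, keep only cancellations, write them to a dedicated topic
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> value != null && value.contains("\"status\":\"CANCELLED\""))
              .to("orders-cancelled");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on shutdown so offsets and state are committed
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```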