Job Title : Confluent Kafka Developer / Streaming Engineer
Experience Required : 5+ Years
Location : Remote (Work from Home)
Notice Period : Immediate to 30 Days
About the Role :
We are seeking an experienced Confluent Kafka Developer to design, build, and maintain event-driven architectures for real-time data integration across banking systems. The ideal candidate will have strong Java development skills and proven expertise with Kafka Connect, Kafka Streams, and ksqlDB to deliver high-throughput, low-latency streaming solutions in a financial services environment.
Key Responsibilities :
- Develop and maintain event-driven architectures using Confluent Kafka for real-time integration across systems such as core banking, CRM, and fraud detection.
- Design and implement Kafka producers and consumers for handling high-volume, low-latency banking transactions.
- Build reusable streaming components using Kafka Streams and ksqlDB for use cases like fraud detection, customer notifications, and operational alerts.
- Collaborate with the Data Governance team to ensure data lineage, quality, and metadata standards are followed.
- Enforce schema evolution best practices using Confluent Schema Registry to maintain compatibility across multiple banking applications.
- Partner with platform, DevOps, cybersecurity, and business analytics teams for seamless delivery and system monitoring.
- Support data platform architects and project managers on integration roadmaps and impact assessments.
- Work with business units (Retail, Islamic Finance, Risk, Compliance) to translate requirements into scalable Kafka-based solutions.
- Enable real-time use cases such as customer onboarding status, transaction streaming, digital engagement analytics, and branch performance monitoring.
Essential Skills & Qualifications :
- Strong Java proficiency (Kafka Connect is Java-based).
- Hands-on experience with the Kafka Connect APIs.
- Ability to implement custom Source/Sink Connectors and Task interfaces in Java.
- Schema handling using SchemaBuilder and Struct, with an understanding of schema evolution.
- Experience implementing robust error handling and retry mechanisms.
- Experience integrating with external systems (databases, REST APIs, SOAP services, etc.).
- Ability to debug and monitor using Confluent Connect logs.
- Solid understanding of Kafka topics, partitions, and offset management.
- Proficiency in SQL for data manipulation and validation.
- Excellent problem-solving and collaboration skills.
- Experience in the banking or financial services domain is a plus.

(ref : hirist.tech)