SPN Globe :
SPN Globe is a premier firm providing comprehensive consultancy and staffing solutions across a wide range of IT domains. We have positioned ourselves as a trusted partner for organizations seeking top-tier, niche-skill talent, with a focus on quality, integrity, and timely delivery. We have successfully navigated the challenges of a volatile IT market, consistently expanding our reach. By maintaining a client-first approach and leveraging innovative recruitment strategies, the company has continued to grow steadily through economic fluctuations with a win-win approach.
Apply immediately to grab this opportunity: email your resume to sayali.k@spnglobe.com. Also, please refer this opportunity to your friends.
Company : IT Company
Position : Senior Software Engineer
Type : Permanent (No third-party payroll)
Experience : 5-12 years
Location : Pune / Nagpur
JOB DESCRIPTION :
- Develop and maintain event-driven architectures using Confluent Kafka for real-time integration across banking systems (core banking, CRM, fraud systems, etc.).
- Design and implement Kafka producers and consumers to handle high-throughput, low-latency banking transactions.
- Develop reusable streaming components using Kafka Streams and ksqlDB for fraud detection, customer notifications, and operational alerts.
- Collaborate with customers' Data Governance teams to ensure data lineage, quality, and metadata standards are maintained throughout the streaming architecture.
- Enforce schema evolution best practices using Confluent Schema Registry to manage compatibility across banking applications.
- Coordinate with platform, DevOps, cybersecurity, and business analytics teams to ensure seamless delivery and monitoring.
- Support data platform architects and project managers on integration roadmaps and impact assessments.
- Enable use cases like real-time customer onboarding status, transaction streaming, digital engagement analytics, and branch performance monitoring.
- Work with business units (Retail, Islamic Finance, Risk, Compliance) to gather requirements and translate them into scalable Kafka-based solutions.
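To give a flavor of the high-throughput, low-latency producer work described above, here is a minimal, hedged sketch of a Kafka producer configuration. The broker address and specific values are illustrative assumptions for preparation purposes, not requirements stated in this posting:

```properties
# Illustrative broker address (assumption)
bootstrap.servers=broker1:9092
# Idempotence with acks=all avoids duplicates and reordering on retry
enable.idempotence=true
acks=all
# Batching trade-off: a small linger keeps latency low while still batching
linger.ms=5
batch.size=65536
compression.type=lz4
# Bound total delivery time, including retries
delivery.timeout.ms=120000
```

Being able to explain trade-offs like these (latency vs. throughput, idempotence vs. raw speed) is a common interview theme for roles handling banking transactions.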
For Confluent, below are the essential expectations you must prepare for:
- Java proficiency : Kafka Connect is Java-based, so strong Java skills are a must.
- Kafka Connect API proficiency : implement custom Source/Sink Connectors and Task interfaces in Java.
- Schema handling : use SchemaBuilder, Struct, and understand schema evolution.
- Error handling & retries : build robust connectors that gracefully handle failures.
- External system APIs : understand how to connect to databases, REST APIs, SOAP services, etc.
- Connector logs : use Confluent Connect logs to debug connector behavior.
- Understanding of Kafka topics, partitions, and offset management.
- SQL proficiency.
(ref : hirist.tech)
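As one concrete illustration tying together the Connect, schema-handling, and error-handling topics above, here is a hedged sketch of a sink connector configuration as it would be submitted to the Kafka Connect REST API. The connector class shown is the Confluent JDBC sink; the connector name, topic, connection URL, Schema Registry URL, and dead-letter-queue topic are illustrative assumptions:

```json
{
  "name": "transactions-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "2",
    "topics": "banking.transactions",
    "connection.url": "jdbc:postgresql://db-host:5432/corebank",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081",
    "errors.tolerance": "all",
    "errors.retry.timeout": "300000",
    "errors.deadletterqueue.topic.name": "dlq.banking.transactions",
    "errors.deadletterqueue.context.headers.enable": "true"
  }
}
```

The `errors.*` settings and the Avro converter backed by Schema Registry are exactly the kind of built-in error-handling and schema-evolution machinery interviewers tend to probe, alongside writing custom Connector and Task classes in Java.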