Purpose of the Job :
The purpose of this job is to provide technical support and take end-to-end responsibility for Kafka implementation. This involves creating Kafka instances and topics, consuming topics using a scripting language, and storing topic data according to application and use-case requirements.
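As a rough illustration of the consume-and-store workflow described above, the sketch below routes consumed records to per-use-case stores. The names, routing rule, and in-memory "stores" are hypothetical stand-ins; a real deployment would poll records from a Kafka consumer (e.g. via a client library such as kafka-python) and write to actual sinks.

```python
import json

def store_record(record: dict, stores: dict) -> None:
    """Route a consumed Kafka record to a per-use-case store.

    `stores` maps a use-case name to a list that stands in for a
    real sink (database, object store, etc.) in this sketch.
    """
    use_case = record.get("use_case", "default")
    stores.setdefault(use_case, []).append(record)

# Simulated message payloads as they might arrive from a Kafka topic;
# in practice these would come from a consumer poll loop.
raw_messages = [
    '{"use_case": "billing", "amount": 120}',
    '{"use_case": "audit", "action": "login"}',
]

stores: dict = {}
for raw in raw_messages:
    store_record(json.loads(raw), stores)

print(sorted(stores))  # use cases seen so far
```

Keeping the routing logic in a small pure function like this makes it easy to unit-test independently of a running broker.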
Years of Experience : 3-5 years only
Location : Thane (Work from Office)
Notice Period : Immediate / 30 Days
Domain Knowledge :
- Kafka
- Cloud systems (AWS)
- SQL
- Python / PySpark
- Airflow
- Docker / Containers
Duties & Responsibilities :
- Design, implement, and maintain enterprise Kafka (RedHat, open source, or any other enterprise edition).
- Create and manage Kafka clusters in cloud environments and containers.
- Integrate applications with Kafka.
- Be well versed in scripting languages such as Python and Spark.
- Demonstrate strong knowledge of message queuing and stream-processing architectures.
- Contribute to team design discussions with detailed technical information.
- Develop code with quality, scalability and extensibility.
- Provide work breakdown and estimates for complex software development tasks.
- Identify strategic / tactical solutions and provide risk assessments and recommendations.
- Be well versed with Linux and cloud-based operating systems.
- Work with multiple teams to ensure the best use of Kafka and safe event streaming of data.
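For context on the message-queuing fundamentals listed above: Kafka assigns each keyed record to a partition by hashing its key, which preserves per-key ordering. The function below is a simplified, hypothetical stand-in for that logic (the real Java client uses murmur2 hashing, not MD5):

```python
import hashlib

def pick_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition index.

    Simplified illustration only: Kafka's default partitioner
    uses murmur2; MD5 is used here just to get a stable hash.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records with the same key always land on the same partition,
# which is what guarantees ordering per key.
p1 = pick_partition(b"order-42", 6)
p2 = pick_partition(b"order-42", 6)
print(p1 == p2)  # same key -> same partition
```

The practical consequence is that choosing good record keys is part of "best use of Kafka": keys determine both ordering guarantees and how evenly load spreads across partitions.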
Qualifications :
Essential qualifications
- Must have experience in end-to-end implementation of Kafka.
- Experience in Python, Spark, Hadoop.
- Experience in ELK, Kibana.
- Experience in Docker and container-oriented infrastructure.
- Ability to multi-task and prioritize in a fast-paced, team-oriented environment.
- Bachelor's degree in computer science, or equivalent work experience.
Work experience & Desired Profile :
Ideally 3+ years of experience; someone who has built an enterprise-level Kafka solution before, preferably more than once.