Process Overview*
Responsibilities*
Key responsibilities include:
- Design, implement, and maintain Kafka producers, consumers, and streams for real-time data processing.
- Configure and manage Kafka topics, partitions, and brokers to ensure high availability and scalability.
- Optimize Kafka performance, including tuning configurations and monitoring metrics.
- Develop and maintain Java-based microservices and applications that integrate with Kafka.
- Write clean, efficient, and maintainable code following best practices.
- Implement RESTful APIs and integrate them with Kafka for data ingestion and processing.
- Collaborate with architects and other developers to design scalable and fault-tolerant systems.
- Implement event-driven architectures and real-time streaming pipelines.
- Ensure data consistency and reliability in distributed systems.
- Monitor Kafka clusters and troubleshoot issues related to brokers, topics, and consumers.
- Debug and resolve performance bottlenecks in Java applications and Kafka pipelines.
- Implement logging, monitoring, and alerting for Kafka and Java applications.
- Work closely with cross-functional teams, including DevOps, QA, and product teams, to deliver high-quality solutions.
- Participate in code reviews, design discussions, and agile ceremonies.
- Create and maintain technical documentation for Kafka configurations, Java services, and data pipelines.
Education*
- Any graduate or postgraduate degree
Experience Range*
Foundational skills*
- Confluent Kafka, Java, and databases such as Db2, Oracle, and SingleStore.
Desired skills*
- DataStage, Spring Boot, DevOps.
Work Timings*: 11 AM to 8 PM
Work Location*: Chennai