About the Role
What the Candidate Will Do
- Design, develop, and maintain scalable Kafka infrastructure.
- Optimize Kafka clusters for performance, reliability, and scalability.
- Collaborate with cross-functional teams to understand requirements and deliver solutions that meet business needs.
- Troubleshoot and resolve complex issues related to Kafka and real-time data streaming.
- Implement monitoring and alerting solutions to ensure the health and performance of Kafka clusters.
- Stay current with industry trends and best practices in distributed systems and data streaming technologies.
Basic Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Extensive experience with Apache Kafka and related technologies.
- Strong understanding of distributed systems and stream processing.
- Proficiency in programming languages such as Java or Go.
- Excellent problem-solving skills and the ability to troubleshoot complex issues.
- Strong communication skills and the ability to work collaboratively in a team environment.
Preferred Qualifications
- Experience with messaging and stream-processing technologies such as Apache Kafka, Apache Pulsar, Apache Flink, or Apache Storm; being an Apache Kafka committer is a big plus.
- Experience with highly available, fault-tolerant distributed systems, large-scale data processing systems, or enterprise/cloud storage systems is also a strong plus.
For Seattle, WA-based roles: The base salary range for this role is USD $185,000 to USD $205,500 per year.
For Sunnyvale, CA-based roles: The base salary range for this role is USD $185,000 to USD $205,500 per year.