About the Role
---- What the Candidate Will Do ----
- Build and manage large, reliable, and performant production deployments of Apache Pinot, Presto, and ClickHouse.
- Work with related data infrastructure technologies like Apache Kafka, Apache Flink, Apache Spark, HDFS, etc.
- Design and implement distributed, real-time algorithms for low-latency, large-scale data processing.
- Work with partner teams within Uber to help them build, deploy, and manage business-critical real-time analytics applications at scale.
- Contribute to and actively engage in the open source communities for Apache Pinot and Presto.
---- Basic Qualifications ----
- Bachelor’s degree in Computer Science or related field.
- 10+ years of experience building large-scale distributed software systems.
- Solid command of Java for backend and systems software development.
---- Preferred Qualifications ----
- MS or PhD in Computer Science or a related field.
- Experience managing production systems with a strong availability SLA.
- Working knowledge of SQL and data analytics at scale.
- Experience working with Apache Pinot, Apache Druid, Presto, Apache Flink, Apache Spark, or similar analytics technologies.
* Accommodations may be available based on religious and/or medical conditions, or as required by applicable law. To request an accommodation, please reach out to .