Master’s degree or foreign equivalent in Computer Science, Computer Engineering, or a related field and 4 years of experience in the job offered or a related occupation.
4 years of experience with each of the following skills:
Using Java, including core Java, concurrency, non-blocking I/O, and performance tuning, to develop a high-throughput, large-scale observability platform.
Using Python to build developer productivity tools that improve developer efficiency.
Using big data technologies such as Spark, Flink, or Hadoop to process large datasets.
Familiarity with time-series database internals.
Experience building a highly scalable, fault-tolerant, distributed telemetry platform that processes high volumes of data.
Using cloud technologies such as AWS, Azure, or Kubernetes to run systems.
Proficiency in Apache Kafka for building real-time data streaming and event-driven architectures, including aggregation engines and stream-alerting systems.
Strong API design and implementation skills for building scalable, high-throughput servers that serve telemetry dashboard queries.