Minimum qualifications:
Bachelor’s degree or equivalent practical experience.
2 years of experience developing large-scale infrastructure, distributed systems, or networks, or experience with compute technologies, storage, or hardware architecture.
2 years of experience with software development and coding, or 1 year of experience with an advanced degree in an industry setting.
Experience with distributed systems.
Preferred qualifications:
Master's degree or PhD in Computer Science or a related technical field.
Experience in one or more of the following: High-Performance Scalable Backends, Large-Scale Data Processing, Large-Scale Distributed Systems, Large-Scale Systems, System Design.
Experience in one or more of the following: Distributed Computing, Distributed Data Analytics, Distributed Processing.
Experience with messaging systems (e.g., Apache Kafka, RabbitMQ, or cloud platform services such as SQS, SNS, Kinesis, or Event Hubs).
Experience in one or more of the following: open-source data analytics tools such as Apache Hadoop or Apache Spark, cloud computing platforms, or big data technologies (Apache Druid, Apache Hive, Apache Flink, Presto).