Master’s degree or foreign equivalent in Computer Science or a related field.
Experience and/or education must include:
Using Java, Python or C++ programming languages to write client-server programs.
Parallel programming, multi-threading, and locking in distributed systems.
Database internals knowledge of key/value stores such as Cassandra and MongoDB, and RDBMSs such as MySQL, to effectively leverage their internal state and data structures to create a faster data replication paradigm.
Distributed data processing with the MapReduce paradigm using Apache Hadoop frameworks to analyze data.
Concurrent algorithms and data structures to effectively handle thousands, if not millions, of clients simultaneously while minimizing cost.
Stream data processing with tools such as Apache Spark to build streaming data solutions.
Source control systems such as Git and SVN to effectively collaborate on medium- to large-scale projects and track source changes for auditability.
TCP/IP, RPC, and HTTP protocols and routing algorithms for networking and inter-service communication.