Minimum qualifications:
Bachelor’s degree or equivalent practical experience.
3 years of experience in one or more object-oriented programming languages (e.g., Java, C++, or Python).
3 years of experience designing data models and data warehouses, using SQL and NoSQL database management systems, and processing data with traditional and distributed systems (e.g., Hadoop, Spark, Dataflow, Airflow).
Preferred qualifications:
Master’s degree in Business, Statistics, Mathematics, Economics, Engineering or Applied Science, or a related field.
5 years of experience in a customer-facing role.
5 years of experience managing projects and working with analytics, software development, or client-side web technologies.
3 years of experience writing and maintaining ETL pipelines that operate on a variety of structured and unstructured sources.
3 years of experience in distributed data processing and Unix or GNU/Linux systems.
Working knowledge of machine learning, including data preparation, model selection, performance evaluation, and hyperparameter tuning.