Minimum qualifications:
Bachelor's degree in Computer Science, Mathematics, a related field, or equivalent practical experience.
3 years of experience with data processing software (e.g., Hadoop, Spark, Pig, Hive) and algorithms (e.g., MapReduce, Flume).
3 years of experience with Google Cloud.
Experience managing client-facing projects, troubleshooting technical issues, and working with Engineering and Sales Services teams.
Experience programming in Python and SQL.
Preferred qualifications:
Experience in technical consulting.
Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
Experience working with Big Data, information retrieval, data mining, or machine learning.
Experience building multi-tier, high-availability applications with modern technologies (e.g., NoSQL, MongoDB, SparkML, TensorFlow).
Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments.