Minimum qualifications:
Bachelor's degree in a quantitative discipline (e.g., Computer Science, Engineering, Statistics, Math), or equivalent practical experience.
5 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external stacks (e.g., Dataflow, Spark).
5 years of experience working with data infrastructure and data models, including writing exploratory queries and scripts in SQL.
5 years of experience coding in one or more programming languages.
Experience with data warehousing concepts and technologies, such as BigQuery.
Preferred qualifications:
Master’s degree in Computer Science, Engineering, Statistics, Math, or a related quantitative field.
Experience with Google Cloud Platform (GCP) services, including Cloud Storage and Data Catalog.
Experience with data quality testing and validation frameworks.
Experience with data visualization tools like Looker.
Understanding of data retention policies and their implementation.