Bachelor's degree or equivalent practical experience.
3 years of experience coding in one or more programming languages.
3 years of experience working with data infrastructure and data models by performing exploratory queries and writing scripts.
3 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external stacks (e.g., Dataflow, Spark).
Preferred qualifications:
3 years of experience with statistical methodology and data consumption tools such as Colab, Jupyter notebooks, and business intelligence platforms (e.g., Tableau, Power BI, Data Studio).
3 years of experience partnering with stakeholders (e.g., users, partners, customers) and managing stakeholder relationships.
3 years of experience developing project plans and delivering projects on time, within budget, and within scope.
Experience with Machine Learning for production workflows.
Understanding of incrementality measurement methodologies and frameworks.