Minimum qualifications:
Bachelor's degree or equivalent practical experience.
3 years of experience working with data infrastructure and data models, including writing exploratory queries and scripts.
3 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external (e.g., Dataflow, Spark) stacks.
3 years of experience coding in one or more programming languages (e.g., Python, Java, R) for data manipulation, analysis, and automation.
3 years of experience analyzing data, querying databases (e.g., with SQL), and creating dashboards and reports.
Preferred qualifications:
3 years of experience with statistical methodology and data consumption tools such as Colab, Jupyter notebooks, and business intelligence platforms (e.g., Tableau, Power BI, Data Studio).
3 years of experience partnering with and managing stakeholders (e.g., users, partners, customers).
3 years of experience developing project plans and delivering projects on time, within budget, and in scope.
Experience with machine learning for production workflows.