The Difference You Will Make:
Data can transform how a company operates; high data quality and strong tooling are the biggest levers for achieving that transformation. You will make that happen.
A Typical Day:
- Understand data needs by interfacing with fellow Analytics Engineers, Data Scientists, Data Engineers, and Business Partners
- Architect, build, and launch efficient and reliable new data models and pipelines in partnership with Data Engineering
- Design, define, and implement metrics and dimensions to enable analysis and predictive modeling
- Become a data expert in your business domain and own data quality
- Build tools for auditing, error logging, and validating data tables
- Build and improve data tooling in partnership with internal Data Platform teams
- Define logging needs in partnership with Data Engineering
- Design and develop dashboards to enable self-serve data consumption
Your Expertise:
- Passion for building high-quality, scalable data assets
- 5+ years of relevant industry experience
- Strong skills in SQL and in optimizing queries on distributed engines (e.g. Spark, Presto, Hive)
- Expertise in at least one programming language for data analysis (e.g. Python, R)
- Experience in schema design and dimensional data modeling
- Ability to perform basic statistical analysis to inform business decisions
- Proven ability to succeed in both collaborative and independent work environments
- Detail-oriented and excited to learn new skills and tools
Preferred Qualifications:
- Experience with a workflow orchestration framework such as Airflow
- Skills in Python, Scala, Superset, and Tableau
- An eye for design when it comes to dashboards and visualization tools