We're looking for a talented Data Engineer. We run a SaaS-based stack using Snowflake and dbt.
WHAT YOU'LL DO
- Design, build, and maintain data pipelines, datasets and catalogs for fast-growing products and business groups.
- Develop self-service data analytics solutions and infrastructure.
- Support ad hoc needs and requests of internal stakeholders.
- Collaborate with analysts, engineers, and internal customers from Product, Finance, Revenue, and Marketing.
WHAT YOU'LL BRING
- Bachelor's or Master's degree in Mathematics, Statistics, Data Science, Computer Science, Computer Engineering, or a related field.
- 3+ years of experience as a Data Engineer, including designing, building, and orchestrating cloud-based data pipelines end to end (e.g., with Airflow, Prefect, or Dagster).
- 3+ years of experience with dimensional data modeling and data warehouse implementation, particularly on MPP databases such as Snowflake, Redshift, and BigQuery.
- Strong knowledge of Python and Python-based data analysis tools such as Jupyter Notebooks and pandas.
- Strong SQL skills, including the ability to write highly performant queries.
- Strong track record of executing projects independently in dynamic environments.
- Ability to quickly understand data and business needs and translate them into data models.
- Strong team player with excellent communication skills.
BONUS SKILLS
- Experience with or knowledge of dbt.