Key job responsibilities
- Interface with Product Managers, Data/Applied Scientists, Software Engineers, and Program Managers to determine requirements for data pipelines, storage, processing, and reporting.
- Build scalable and production-ready data pipelines to extract features from petabytes of raw data.
- Manage data refreshes and perform systematic data quality checks at scale.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
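As an illustration of the pipeline and data quality work described above, here is a minimal sketch in Python. It is hypothetical, not part of this posting: the table name, schema, and check are invented for the example, and a real pipeline of this kind would run on Spark or a warehouse engine rather than in-memory SQLite.

```python
# Hypothetical miniature ETL pipeline: extract raw records, transform them,
# load them into a warehouse table, then run a systematic data quality check.
import sqlite3

def extract():
    # In production this would read raw data from distributed storage;
    # here, a few inline rows stand in for it.
    return [("2024-01-01", "click", 3),
            ("2024-01-01", "view", None),
            ("2024-01-02", "click", 5)]

def transform(rows):
    # Drop records with missing counts and normalize event names.
    return [(day, event.upper(), n) for day, event, n in rows if n is not None]

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS events (day TEXT, event TEXT, n INTEGER)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

def quality_check(conn):
    # Systematic check: no NULL counts may reach the warehouse table.
    nulls, = conn.execute("SELECT COUNT(*) FROM events WHERE n IS NULL").fetchone()
    assert nulls == 0, f"quality check failed: {nulls} NULL counts"
    return True

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
print(quality_check(conn))  # → True
```

The same shape scales up: the extract/transform/load stages become Spark jobs or warehouse SQL, and the quality check becomes a suite of assertions run after every data refresh.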
Qualifications
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js
- Experience mentoring team members on best practices
- Experience with big data technologies such as Hadoop, Hive, Spark, or EMR
- Experience operating large data warehouses
- Experience working with benefits systems, insurance/payor systems, and/or healthcare IT systems