Key job responsibilities
· Build and deliver high-quality data architecture and pipelines to support scientists, product and program managers, and customer reporting needs.
· Standardize, document, and socialize metrics with relevant stakeholders.
· Translate basic business problem statements into analysis requirements; work with internal customers to define the best output based on expressed stakeholder needs.
· Rapidly prototype datasets for ad hoc analyses.
· Manage and deliver multiple complex business requests simultaneously.
Qualifications
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience developing and presenting recommendations for new metrics that provide a better understanding of business performance
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience analyzing and interpreting data with PySpark