This job requires you to hit the ground running and to learn quickly. Primary responsibilities include defining problems and building analytical frameworks that help operations streamline their processes, identifying gaps in existing processes by analyzing data and liaising with the relevant teams to close them, and analyzing data and metrics and sharing updates with internal teams.

Key job responsibilities
1) Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap.
2) Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout execution.
3) Define the analytical approach; review and vet it with stakeholders.
4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs.
5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering toward full-scale automation.
6) Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage); see the query sketch after this list.
7) When needed, pull data from multiple similar sources to triangulate on data fidelity.
8) Provide program communications to stakeholders.
9) Communicate roadblocks to stakeholders and propose solutions.
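Responsibility 6 above mentions window functions as a way to push aggregation into the warehouse and reduce post-processing. The snippet below is a minimal illustrative sketch, not taken from the posting: a hypothetical Redshift-style query that computes a 7-day rolling order count per station entirely in SQL, wrapped as a Python string so it could be passed to any SQL client. The table and column names (daily_orders, station_id, order_day, orders) are invented placeholders.

```python
# Hedged sketch: a Redshift-style query using a window function so that
# aggregation happens in the warehouse instead of in post-processing.
# Table/column names (daily_orders, station_id, order_day, orders) are
# hypothetical placeholders, not taken from the job posting.
ROLLING_ORDERS_SQL = """
SELECT
    station_id,
    order_day,
    orders,
    -- 7-day rolling total per station, computed in the warehouse
    SUM(orders) OVER (
        PARTITION BY station_id
        ORDER BY order_day
        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
    ) AS rolling_7d_orders
FROM daily_orders
ORDER BY station_id, order_day;
"""

if __name__ == "__main__":
    # In practice the string would be executed with any SQL client
    # (e.g., psycopg2 against Redshift); here we just print it.
    print(ROLLING_ORDERS_SQL)
```

Doing the rolling aggregation in the warehouse keeps the result set small and lets clients rerun or tweak the query themselves, which matches the self-serve intent described above.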
A day in the life
1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes.
2) Handle large data sets in analysis through the use of additional tools.
3) Understand the basics of test-and-control comparison; may provide insights through basic statistical measures such as hypothesis testing (see the sketch after this list).
4) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved.

We enable operations to make data-driven decisions by developing near-real-time dashboards, self-serve dive-deep capabilities, and advanced analytics capabilities.
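Item 3 above mentions test-and-control comparison through basic hypothesis testing. The sketch below is a generic illustration, not part of the posting: a two-sample Welch t-test comparing a hypothetical test group against a control group using numpy and scipy. The data are simulated purely for demonstration.

```python
# Hedged sketch: comparing a test group against a control group with a
# basic hypothesis test (Welch's two-sample t-test). The data here are
# simulated placeholders, not real metrics from the role.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=100.0, scale=15.0, size=500)  # baseline process metric (simulated)
test = rng.normal(loc=103.0, scale=15.0, size=500)     # metric after a process change (simulated)

# Welch's t-test does not assume equal variances between groups.
t_stat, p_value = stats.ttest_ind(test, control, equal_var=False)

print(f"control mean: {control.mean():.2f}, test mean: {test.mean():.2f}")
print(f"t-statistic: {t_stat:.3f}, p-value: {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No statistically significant difference at the 5% level.")
```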
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, Quicksight, or similar tools
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling