Strong Quantitative Foundation and Technical Skills: Bachelor's degree in Computer Science, Electrical Engineering, or a related quantitative field (strong mathematical foundations, software engineering, broad knowledge of data analysis and practical machine learning)
Data Engineering and Analytics: Skilled at transforming raw data into actionable insights at scale — formulating practical problems, then building ETL processes (e.g., Python and Spark) and data visualizations (e.g., Tableau)
Business Acumen and Problem-Solving: Ability to understand the broader business context, solve complex problems, and communicate findings effectively to stakeholders.
Adaptability and Collaboration: Comfortable with ambiguity, eager to learn, and capable of working effectively in a collaborative environment. Strong interpersonal skills and the ability to build relationships with diverse stakeholders are essential.
M.S. or Ph.D. in Computer Science, Electrical Engineering, Applied Mathematics, Statistics, or a similar quantitative field, with strong statistical skills and intuition
Proficiency in distributed compute and storage technologies such as HDFS, S3, Iceberg, Spark, and Trino
Proficiency in designing ETL flows and their automation/scheduling (e.g., Kubernetes and Airflow)
Working knowledge of operating systems (memory management, thread/process lifecycles, file systems, etc.)
Experience driving cross-functional projects with diverse stakeholders
Skilled at connecting data insights to the company's overall strategy and objectives.
Note: Apple benefit, compensation and employee stock programs are subject to eligibility requirements and other terms of the applicable plan or program.