As part of this group, you will contribute to building the infrastructure that powers the development of multi-modal agentic workflows.
Strong background in computer science: algorithms, data structures, and system design
Strong Python skills
3+ years of experience in large-scale distributed system design, operation, and optimization
Experience with SQL/NoSQL database technologies, data warehouse frameworks such as BigQuery/Snowflake/Redshift/Iceberg, and data pipeline frameworks such as GCP Dataflow/Apache Beam/Spark/Kafka
Experience with vector databases
Experience deploying and serving LLMs
Experience processing data for ML applications at scale
Excellent interpersonal skills; able to work independently as well as cross-functionally
Experience fine-tuning and evaluating Large Language Models
Note: Apple benefit, compensation and employee stock programs are subject to eligibility requirements and other terms of the applicable plan or program.