In this role on search evaluation infrastructure, you will empower relevance engineers in their experimentation needs, letting them iterate more quickly on their ideas within the infrastructure that serves our large-scale indexes. For example, you would design end-to-end solutions that give them insight into the impact their work has on search quality, or enable them to evaluate the changes they make with confidence.

Typical tasks include:

• Designing and developing solutions to enable and orchestrate reliable data extraction and analysis at scale.
• Developing and integrating experimentation-focused systems that accelerate iteration with ML models against large indexes.
• Building tooling that lets engineers conduct opportunity analysis and identify where they can add the most value.
• Designing and implementing systems that integrate with our retrieval-augmented generation components and provide insight into how those components behave.
• Designing features and systems that enable retrieval on large token- and embedding-based indexes.
• Streamlining the onboarding and experimentation experience for our search systems, empowering other teams to use our components more efficiently and iterate faster on their relevance improvements.