
Microsoft Principal Research Engineer
United States, Washington
Job ID: 3940185
Posted: 16.07.2024

Required Qualifications:

  • Bachelor's Degree in Computer Science, or related technical discipline AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python

    • OR equivalent experience.

Preferred Qualifications:

  • Bachelor's Degree in Computer Science, or related technical discipline AND 10+ years technical engineering experience with coding in languages including, but not limited to, Python, C, C++, Rust, or C#
    • OR Master's Degree in Computer Science or related technical field AND 8+ years technical engineering experience with coding in languages including, but not limited to, Python, C, C++, Rust, or C#
    • OR equivalent experience
  • Expertise in one of the following preferred:
    • Deep familiarity with transformer-based model inference, including batch processing paradigms for hosted models
    • Expertise in context-free grammar specification and parsing
    • Experience with constrained decoding paradigms (regex-based constraints, grammar-based constraints, JSON mode, function calling, etc.); a minimal illustration follows this list
  • Contribution history to open-source projects, especially in the LLM/AI space
  • Familiarity with the research process and a publication history in AI conferences
  • Familiarity with Python programming paradigms and modern LLM APIs
  • Effective communication skills and desire to collaborate in a multi-disciplinary team
  • Familiarity with the Guidance open-source library
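For context on the constrained-decoding bullet above, the sketch below shows the core idea behind regex- or option-constrained decoding: at each step, mask out any next token that would violate the constraint before choosing one. Everything here (the toy vocabulary, the fake_logits stand-in for a model forward pass, the ALLOWED set) is hypothetical and illustrative only; it is not the Guidance API or any production inference stack.

```python
# Minimal sketch of constrained decoding via logit masking (illustrative only;
# the model, tokenizer, and vocabulary here are toy stand-ins, not a real API).
import math
import random

VOCAB = ["pos", "neg", "itive", "ative", "neutral", "!", " "]  # toy token vocabulary
ALLOWED = {"positive", "negative", "neutral"}                  # constraint: output must be one of these


def fake_logits(prefix: str) -> list[float]:
    """Stand-in for a language model forward pass: returns one logit per vocab token."""
    random.seed(len(prefix))                 # deterministic toy scores
    return [random.uniform(-1.0, 1.0) for _ in VOCAB]


def allowed_mask(generated: str) -> list[bool]:
    """A token is allowed if appending it keeps the output a prefix of some allowed string."""
    return [any(opt.startswith(generated + tok) for opt in ALLOWED) for tok in VOCAB]


def constrained_decode(max_steps: int = 8) -> str:
    out = ""
    for _ in range(max_steps):
        if out in ALLOWED:                   # constraint satisfied: stop
            break
        logits = fake_logits(out)
        mask = allowed_mask(out)
        # Greedy pick among tokens the constraint permits (disallowed logits treated as -inf).
        best = max(range(len(VOCAB)),
                   key=lambda i: logits[i] if mask[i] else -math.inf)
        out += VOCAB[best]
    return out


if __name__ == "__main__":
    print(constrained_decode())              # prints one of ALLOWED
```

In practice the same masking step is applied to real model logits for every element of a batch, which is where the batch-processing expertise listed above comes in.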

Certain roles may be eligible for benefits and other compensation.

Responsibilities:
  • Develop and implement new constrained decoding research techniques for increasing LLM inference quality and/or efficiency. Example areas of interest include speculative execution, new decoding strategies (e.g. extensions to beam search), “classifier in the loop” decoding for responsible AI, improving AI planning, and explorations of attention-masking based constraints.
  • Re-imagine the use and construction of context-free grammars (CFGs) and beyond to fit Generative AI. Examples of improvements here include better tools for constructing formal grammars, extensions to Earley parsing, and efficient batch processing for constrained generation. How these techniques are presented to developers, who may not be well versed in grammars and constrained generation, in an intuitive, idiomatic programming syntax is also top of mind (a toy grammar sketch follows this list).
  • Design principled evaluation frameworks and benchmarks for measuring the effects of constrained decoding on a model. Some areas of interest to study carefully include efficiency (token throughput and latency), generation quality, and the impact of constrained decoding on AI safety (a minimal measurement sketch follows this list).
  • Publish your research in top AI conferences and contribute your research advances to the guidance open-source project.
  • Embody our culture and values.
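As referenced in the context-free grammar responsibility above, the toy sketch below specifies a tiny grammar in plain Python and checks strings against it with a simple recursive-descent recognizer. It is an assumption-laden illustration of mapping a grammar into idiomatic code, not Earley parsing and not the Guidance library's grammar API.

```python
# Minimal sketch: a tiny context-free grammar written as idiomatic Python, plus a
# recursive-descent recognizer. Grammar (toy arithmetic):
#   expr -> term (("+" | "-") term)*
#   term -> NUMBER | "(" expr ")"
import re


def tokenize(text: str) -> list[str]:
    return re.findall(r"\d+|[+\-()]", text)


class Parser:
    def __init__(self, tokens: list[str]):
        self.tokens = tokens
        self.pos = 0

    def peek(self) -> str | None:
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, expected: str) -> None:
        if self.peek() != expected:
            raise SyntaxError(f"expected {expected!r}, got {self.peek()!r}")
        self.pos += 1

    def expr(self) -> None:
        self.term()
        while self.peek() in ("+", "-"):
            self.eat(self.peek())
            self.term()

    def term(self) -> None:
        tok = self.peek()
        if tok is not None and tok.isdigit():
            self.pos += 1                     # NUMBER
        elif tok == "(":
            self.eat("(")
            self.expr()
            self.eat(")")
        else:
            raise SyntaxError(f"unexpected token {tok!r}")


def matches_grammar(text: str) -> bool:
    parser = Parser(tokenize(text))
    try:
        parser.expr()
        return parser.pos == len(parser.tokens)   # must consume all input
    except SyntaxError:
        return False


if __name__ == "__main__":
    print(matches_grammar("(1+2)-3"))   # True
    print(matches_grammar("1+"))        # False
```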
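And for the evaluation responsibility, a minimal sketch of measuring token throughput and per-token latency around a decoding loop. generate_one_token is a hypothetical stand-in for a real model call, and the numbers it produces are meaningless beyond demonstrating the measurement pattern.

```python
# Minimal sketch of the efficiency measurements described above: token throughput
# and per-token latency for a decoding loop. `generate_one_token` is a hypothetical
# stand-in for a real model step, not an actual API.
import statistics
import time


def generate_one_token(prefix: list[int]) -> int:
    """Hypothetical model step; sleeps briefly to simulate inference cost."""
    time.sleep(0.002)
    return len(prefix) % 1000          # dummy token id


def benchmark_decode(num_tokens: int = 50) -> None:
    prefix: list[int] = []
    latencies: list[float] = []
    start = time.perf_counter()
    for _ in range(num_tokens):
        t0 = time.perf_counter()
        prefix.append(generate_one_token(prefix))
        latencies.append(time.perf_counter() - t0)
    total = time.perf_counter() - start

    print(f"tokens/s       : {num_tokens / total:.1f}")
    print(f"median latency : {statistics.median(latencies) * 1e3:.2f} ms/token")
    print(f"p95 latency    : {sorted(latencies)[int(0.95 * len(latencies))] * 1e3:.2f} ms/token")


if __name__ == "__main__":
    benchmark_decode()
```

A fuller evaluation framework would run the same loop with and without constraints enabled and compare the resulting throughput, latency, and output quality.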