

What you'll be doing:
Build applications and develop infrastructure for a highly scalable deep learning platform.
Design, build, and maintain high-bandwidth data pipelines and related infrastructure, such as APIs, databases, and services.
Work closely with domain experts and research teams, and take ownership of productizing, releasing, and maintaining deep learning products.
Be an engineering generalist: discover and build the skills needed at different times to solve the problems at hand.
What we need to see:
Bachelor's or Master's degree in computer science, engineering, or a related field (or equivalent experience)
6+ years of experience in services, pipelines, API development, and system design.
Expertise in Python, JavaScript, or a similar programming language.
Eagerness to learn and perform beyond your prior experience and expertise.
Ways to stand out from the crowd:
Experience in DevOps, full-stack development, databases, and cloud computing.
Fundamentals in machine learning and experience building RAG pipelines or other LLM applications.
You will also be eligible for equity.