Primary Job Responsibilities
Develop innovative capabilities for open source platforms used to build, train, test, and serve models for AI-enabled applications, carrying your work through all phases of software development and delivery: research and conceptualization, team design review, implementation, test strategy and automation, productization and packaging, and deployment support through to end users.
Monitor and participate in upstream open source AI/ML communities, evaluating new AI/ML-related technologies in the space and considering potential integrations and collaborations upstream.
Regularly read papers and keep up with AI/ML developments, particularly in the generative AI space.
Promote and foster Red Hat's open source value proposition as it pertains to AI/ML engineering and product development.
Contribute to the development of the open-source projects that comprise Red Hat’s AI family of products.
Take part in the shared responsibility of delivering and maintaining assigned products.
Regularly communicate with project stakeholders including other teams of Red Hat engineers, product managers, consultants, management, and senior leadership.
Take on the role of Subject Matter Expert, as needed, per domain and project.
Lead, coach, and collaborate with junior engineers as they build AI/ML knowledge and skills.
Coordinate and collaborate with external teams, including IBM Research, on key strategic vision and implementation details.
Share experiments and learnings with the broader data science community through blog posts, presentations, and contributions at technical outlets and conferences.
Promote upstream acceptance and community building.
Required Skills
Bachelor's degree in computer science, data science, computer engineering, or an equivalent field.
5+ years of experience as a data scientist (or similar roles).
Extensive, advanced experience with Python development.
Experience with AI and machine learning platforms, tools, and frameworks, such as TensorFlow, PyTorch, llama.cpp, and Kubeflow.
Experience working with Kubernetes/OpenShift, containers, and other cloud-native technologies, including troubleshooting issues and writing YAML.
Demonstrated knowledge of unit testing frameworks and methodologies.
Demonstrated knowledge of mathematics and statistics relevant to machine learning.
Strong self-motivation and organizational skills.
Demonstrated ability to context switch between multiple concurrent projects.
Excellent English written and verbal communication skills.
Collaborative attitude and willingness to share ideas openly.
Ability to quickly learn and use new tools and technologies.
Passion for developing open source software.
Nice to Haves
Familiarity with participating in an agile development team.
Experience writing Kubernetes controllers and operators.
Experience creating Ansible automation scripts.
Experience with big data storage formats and systems, such as Parquet, Avro, and S3.
Experience with GPU hardware accelerators and associated software stacks, such as CUDA and ROCm.
Understanding of DevOps methodology.