Unique opportunity to contribute to the development of Knowledge Graph solutions covering SAP domain knowledge and providing grounding for LLM applications (e.g., Joule).
Build ETL pipelines to ingest data, metadata, and other development artifacts into our Knowledge Graph.
Partner with domain experts across different lines of business (LoBs) at SAP to understand and interpret their data, metadata, and other development artifacts, and make them consumable in our Knowledge Graph.
Extract relevant information from various LoB data sources for further processing and enrichment of our large-scale Knowledge Graph serving the Foundation Model and other LLM use cases.
Contribute to the design and development of stable and scalable applications.
What you bring
Bachelor's or master's degree in Computer Science, Artificial Intelligence, Physics, Mathematics, or another relevant discipline
2+ years of hands-on experience with pipelines that ingest and transform data in data science projects
Proficiency in Python
Experience building ETL pipelines with Metaflow, Airflow, or similar frameworks is a plus
Hands-on experience with any of the major cloud stacks (AWS, GCP, Azure)
Understanding of the SAP S/4HANA backend (ABAP and Data Dictionary) and data models (e.g., VDM, CDS Views, RAP, OData), or experience with BTP, is a plus
Experience with object stores, relational databases, or vector databases is a must
Experience with Knowledge Graph technologies (e.g., RDF, SPARQL) is a plus
Curiosity and willingness to experiment with and adopt new technologies and frameworks
Ability to work effectively in distributed, cross-cultural teams