The Role:
Build ETL pipelines to ingest data, metadata, and other development artifacts into our Knowledge Graph.
Extract relevant information from various LoB data sources for further processing and enrichment of our large-scale Knowledge Graph serving the Foundation Model and other LLM use cases.
Contribute to the design and development of stable, scalable applications.
Make critical design decisions regarding the selection and implementation of underlying technologies.
What you bring
- 5+ years of related professional experience in Software Engineering with at least 2 years of professional experience as a data engineer.
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Physics, Mathematics, or another relevant discipline.
- Proficiency in Python, as well as experience building ETL pipelines with Metaflow, Airflow, or similar frameworks.
- Hands-on experience with at least one major cloud stack (AWS, GCP, Azure, BTP).
- Experience building data pipelines, ideally for data science projects, e.g., those involving Large Language Models.
- Understanding of SAP S/4HANA backend (ABAP and Data Dictionary) and data models (e.g., VDM, CDS Views, RAP, OData) is a plus.
- Experience with object stores, relational databases or vector databases.
- Experience with Knowledge Graph technologies (e.g., RDF, SPARQL) is a plus.
- Curiosity and willingness to experiment with and adopt new technologies and frameworks.
- Strong communication, collaboration, and leadership skills; experience with agile methodologies; and the ability to work effectively in distributed, cross-cultural teams.
Meet your team
SAP's AI organization is dedicated to seamlessly infusing AI into all enterprise applications, enabling customers, partners, and developers to enhance business processes and generate remarkable business value. Join our international AI team, where innovation thrives, opportunities for personal development abound, and exceptional colleagues collaborate globally.