Expoint - all jobs in one place

The place where the best experts and companies meet

Limitless High-tech career opportunities - Expoint

SAP Senior Python Developer Reference Data Management Knowledge Graphs m/f/d 
Germany, Baden-Württemberg 
309195735

09.09.2024

The Signavio content engineering unit leverages SAP HANA and RDF (Resource Description Framework) technology to create an extensive process database containing a wealth of process-related metadata. This database is the basis for shipping value accelerators and for building AI applications, such as large-language models (LLMs) on knowledge graphs.
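To make the RDF data model concrete: RDF represents everything as subject-predicate-object triples, which together form a knowledge graph. The following is a minimal toy sketch in plain Python of that triple model applied to process metadata; the process names and predicates are illustrative assumptions, not the actual Signavio schema.

```python
# Toy in-memory triple store illustrating the RDF model: every fact is a
# (subject, predicate, object) triple. Names below are illustrative only.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def match(self, s=None, p=None, o=None):
        """Return triples matching the pattern; None acts as a wildcard."""
        return [
            t for t in self.triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)
        ]

store = TripleStore()
store.add("InvoiceApproval", "hasStep", "CheckAmount")
store.add("InvoiceApproval", "hasStep", "ApprovePayment")
store.add("CheckAmount", "performedBy", "Finance")

# Graph-style query: which steps belong to the InvoiceApproval process?
steps = sorted(o for _, _, o in store.match(s="InvoiceApproval", p="hasStep"))
print(steps)  # ['ApprovePayment', 'CheckAmount']
```

In a production setting this role would work with a real RDF stack (e.g. SPARQL over SAP HANA) rather than a toy store, but the triple-pattern matching shown here is the core idea behind querying a knowledge graph.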

Within an agile, interdisciplinary, and international environment, you will be working closely with process and data experts from the Value Accelerator Delivery unit (VAD). Your responsibilities cover the entire chain of process-related data management, including semantic data modeling, data loading and integration via extract-transform-load (ETL) tooling, and data quality assurance. You will develop the primary infrastructure to create data products and manage value accelerator content, including data lifecycle management. In addition, you will work on innovation projects in the area of VAD data integration and consumption, including the development, application, and maintenance of data-matching algorithms and question-answering systems.
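The extract-transform-load chain mentioned above can be sketched as three small functions: pull raw records from a source, normalize and deduplicate them, and write them to a target. This is a minimal illustrative sketch; the field names and sample data are assumptions, not the actual VAD data model.

```python
# Minimal ETL sketch: extract raw records, transform (normalize and
# deduplicate), load into a target store. Field names are illustrative.

def extract():
    # In practice this would read from source systems; here, static samples.
    return [
        {"process": " invoice approval ", "owner": "finance"},
        {"process": "Invoice Approval", "owner": "Finance"},   # duplicate
        {"process": "order fulfilment", "owner": "logistics"},
    ]

def transform(records):
    seen, out = set(), []
    for r in records:
        key = r["process"].strip().lower()
        if key in seen:  # deduplicate on the normalized process name
            continue
        seen.add(key)
        out.append({"process": key.title(), "owner": r["owner"].title()})
    return out

def load(records, target):
    target.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # two cleaned records; the duplicate is dropped
```

Real pipelines in this role would add scheduling, error handling, and data-quality checks on top of this basic extract/transform/load split.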

Responsibilities

  • Drive the design and implementation of a data management infrastructure for process-related (meta) data.
  • Drive the development of a CI/CD infrastructure to manage existing and new content.
  • Automate manual tasks by implementing them as pipelines.
  • Implement and administer data loading pipelines for process-related data.
  • Gather and formalize technical requirements from product managers, consultants, and process experts.
  • Productize research results.
  • Work with engineers, architects, project managers, UX experts, and technical writers to drive the development of data products.
  • Analyze and work on the application of machine learning for content curation and consumption.

Experience & Role Requirements

  • Bachelor’s or Master’s degree in computer science, business informatics, data science, or a related field
  • At least three years of experience in professional software development
  • Excellent Python programming skills
  • Knowledge of TypeScript and Java
  • Advanced skills in Kubernetes/Kyma
  • Good understanding of data modeling techniques, especially UML
  • Experience with data-oriented applications and data-integration patterns (ETL)
  • Initial experience with AI and AI-based applications
  • Strong willingness to learn new technologies
  • Excellent communication skills in English
  • Deep knowledge of agile methodologies
  • Knowledge of RDF knowledge graphs is a big advantage
  • Team player


Job Segment: Cloud, ERP, Developer, Database, Data Modeler, Technology, Data