Expoint – all jobs in one place
Finding a high-tech job at the best companies has never been easier

SAP MDM & Data Governance Lead wanted at EY in Mannheim, Germany

Find your perfect match with Expoint! Search for SAP MDM & Data Governance Lead opportunities in Mannheim, Germany, and join a network of leading high-tech companies such as EY. Sign up now and find your dream job with Expoint!
2 jobs found
Yesterday
EY

EY SAP Data Engineer / Integration Specialist Germany, Baden-Württemberg, Mannheim

Limitless High-tech career opportunities - Expoint
Description:

Job Summary

As a Data Engineer / Integration Specialist within the EY SAP Enterprise Data Management Initiative run by SAP Platforms & Assets, you will be responsible for designing, building, and optimizing scalable data pipelines and system integrations for data management and transformation projects.

You will be part of a global team creating an advanced, cloud-enabled data platform that uses technologies from SAP, Databricks, Snowflake, NVIDIA and Microsoft. Your focus will be to enable seamless data movement, transformation, and integration between SAP systems and modern data platforms, ensuring data availability and quality across multiple environments.

This role requires a hands-on, technically proficient individual with deep experience in both SAP integration and modern cloud-native data engineering.

Essential Functions of the Job

  • Build and maintain ETL/ELT data pipelines and integrations to support SAP BDC migration and transformation workflows.
  • Develop scalable data flows connecting SAP systems with cloud platforms such as Databricks, Azure Synapse, and Snowflake.
  • Integrate SAP data (IDocs, BAPIs, flat files) with modern data lakes, warehouses, and analytical tools using Databricks, NVIDIA RAPIDS, and other technologies.
  • Optimize data transformation jobs for performance, reliability, and maintainability in hybrid or multi-cloud setups.
  • Collaborate with architects to ensure integration solutions align with enterprise data strategy and standards.
  • Apply best practices in data security, encryption, data masking, and compliance within integration pipelines.
  • Develop reusable scripts, connectors, and data wrangling logic across SAP and cloud-native environments.
  • Monitor data jobs, troubleshoot failures, and perform root cause analysis to resolve complex data movement issues.
  • Use CI/CD practices to automate deployment of data jobs across dev/test/prod environments.
  • Document technical specifications, mappings, job flows, and operational procedures to support long-term maintainability.
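As an illustration of the kind of ETL/transform work these bullets describe (not part of the posting), here is a minimal extract-transform-load step in plain Python; the field names, file layout, and transform rules are hypothetical, stand-ins for a real SAP flat-file extract:

```python
import csv
import io

# Hypothetical flat-file extract, e.g. material master records exported from SAP.
RAW = """MATNR,MAKTX,NETPR
000000000000001001, Widget A ,19.90
000000000000001002,Widget B,
"""

def transform(row):
    """One transform step: trim text, strip SAP-style leading zeros, cast price."""
    return {
        "material": row["MATNR"].lstrip("0"),
        "description": row["MAKTX"].strip(),
        # A missing price becomes None instead of failing the whole load.
        "net_price": float(row["NETPR"]) if row["NETPR"].strip() else None,
    }

def run_pipeline(raw_text):
    """Extract -> transform -> load into an in-memory target table."""
    reader = csv.DictReader(io.StringIO(raw_text))
    return [transform(row) for row in reader]

target = run_pipeline(RAW)
```

In a real project the same shape would be expressed in an ETL tool or PySpark job rather than stdlib Python; the point is only the staged extract/transform/load structure and tolerant handling of incomplete records.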

Knowledge and Skills Requirements

  • Deep experience with SAP BDC, data migration, and SAP integration techniques (LSMW, IDocs, BAPIs, BDC recordings).
  • Strong proficiency in ETL tools and frameworks, e.g., SAP BODS, Azure Data Factory, Informatica, or Talend.
  • Hands-on with cloud-based integration services, e.g., AWS Glue, Google Dataflow, Snowflake Tasks/Streams, or Databricks Workflows.
  • Familiarity with cloud data platforms (Azure Synapse, Google BigQuery, Snowflake) and parallel compute frameworks (NVIDIA RAPIDS, PySpark).
  • Strong skills in SQL, scripting (Python, Shell), and version control (Git).
  • Knowledge of API integrations, message queues, and event-driven data pipelines.
  • Experience in data quality validation and exception handling within pipelines.
  • Comfortable working in Agile delivery and CI/CD environments.
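The "data quality validation and exception handling within pipelines" requirement above could be sketched as follows; this is illustrative only, and the rule names and record fields are invented:

```python
def validate(record, rules):
    """Return the names of the rules a record violates (empty list = clean)."""
    return [name for name, check in rules.items() if not check(record)]

def split_by_quality(records, rules):
    """Route clean records onward and quarantine failures with their reasons,
    instead of aborting the whole batch on the first bad row."""
    clean, rejected = [], []
    for rec in records:
        failures = validate(rec, rules)
        if failures:
            rejected.append({"record": rec, "failures": failures})
        else:
            clean.append(rec)
    return clean, rejected

# Hypothetical quality rules for an incoming feed.
RULES = {
    "has_id": lambda r: bool(r.get("id")),
    "price_non_negative": lambda r: r.get("price", 0) >= 0,
}

clean, rejected = split_by_quality(
    [{"id": "A1", "price": 5.0}, {"id": "", "price": -1.0}],
    RULES,
)
```

The quarantine-with-reasons pattern is what lets a pipeline keep running while failed records are replayed later, which is the usual expectation behind this bullet.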

Other Requirements

  • Strong collaboration skills in global delivery models (onshore/offshore).
  • Certifications in relevant tools or cloud platforms (e.g., Azure Data Engineer, AWS Big Data) are a plus.
  • Working experience in regulated or large enterprise data environments is desirable.
  • Ability to travel based on project or client needs.

Job Requirements

  • Education
    • BS/MS in Computer Science, Data Engineering, Information Systems, or related field.
  • Experience
    • 8–12+ years of experience in data engineering and system integration, with 3–5 years focused on SAP data pipelines and cloud integration technologies.

What we offer you

develop you with future-focused skills and equip you with world-class experiences.

To help create an equitable

18.11.2025
EY

EY SAP MDM & Data Governance Lead Germany, Baden-Württemberg, Mannheim

Come find your dream high-tech job with Expoint. Our platform makes it easy to search for SAP MDM & Data Governance Lead opportunities at EY in Mannheim, Germany. Whether you are looking for a new challenge or want to work with a specific organization in a particular role, Expoint makes it simple to find the right fit for you. Connect with leading companies in your area today and advance your high-tech career! Sign up now and take the next step in your career journey with Expoint.