Expoint – all jobs in one place
Finding the best job has never been easier

SAP MDM & Data Governance Lead jobs at EY in Germany, Mannheim

Discover your perfect match with Expoint. Search for job opportunities as an SAP MDM & Data Governance Lead in Germany, Mannheim, and join the network of leading companies in the high-tech industry, like EY. Sign up now and find your dream job with Expoint.
2 jobs found
Yesterday
EY

EY SAP Data Engineer / Integration Specialist Germany, Baden-Württemberg, Mannheim

Description:

Job Summary

As a Data Engineer / Integration Specialist within the EY SAP Enterprise Data Management Initiative run by SAP Platforms & Assets, you will be responsible for designing, building, and optimizing scalable data pipelines and system integrations for data management and transformation projects.

You will be part of a global team creating an advanced, cloud-enabled data platform that uses technologies from SAP, Databricks, Snowflake, NVIDIA and Microsoft. Your focus will be to enable seamless data movement, transformation, and integration between SAP systems and modern data platforms, ensuring data availability and quality across multiple environments.

This role requires a hands-on, technically proficient individual with deep experience in both SAP integration and modern cloud-native data engineering.

Essential Functions of the Job

  • Build and maintain ETL/ELT data pipelines and integrations to support SAP BDC migration and transformation workflows.
  • Develop scalable data flows connecting SAP systems with cloud platforms such as Databricks, Azure Synapse, and Snowflake.
  • Integrate SAP data (IDocs, BAPIs, flat files) with modern data lakes, warehouses, and analytical tools using Databricks, NVIDIA RAPIDS, and other technologies.
  • Optimize data transformation jobs for performance, reliability, and maintainability in hybrid or multi-cloud setups.
  • Collaborate with architects to ensure integration solutions align with enterprise data strategy and standards.
  • Apply best practices in data security, encryption, data masking, and compliance within integration pipelines.
  • Develop reusable scripts, connectors, and data wrangling logic across SAP and cloud-native environments.
  • Monitor data jobs, troubleshoot failures, and perform root cause analysis to resolve complex data movement issues.
  • Use CI/CD practices to automate deployment of data jobs across dev/test/prod environments.
  • Document technical specifications, mappings, job flows, and operational procedures to support long-term maintainability.
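The pipeline duties above can be sketched, in highly simplified form, as a small extract-transform-load flow. The flat-file layout, field names (MATNR, MAKTX, etc.), and the SQLite staging target are illustrative assumptions for the sketch, not EY or SAP specifics; a real SAP-to-cloud pipeline would use the platform tools named in the listing.

```python
# Minimal ETL sketch: read a pipe-delimited flat-file extract (a common
# SAP download format), clean the fields, and load them into a staging
# table. All names and the file layout are hypothetical.
import csv
import io
import sqlite3

SAMPLE_EXTRACT = """MATNR|MAKTX|MEINS|NTGEW
100001|Hex bolt M8 | EA |0.012
100002|Washer 8mm|EA|0.003
"""

def extract(text):
    """Parse the pipe-delimited extract into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text), delimiter="|"))

def transform(rows):
    """Trim stray whitespace and cast the net-weight field to float."""
    cleaned = []
    for row in rows:
        cleaned.append({
            "material": row["MATNR"].strip(),
            "description": row["MAKTX"].strip(),
            "unit": row["MEINS"].strip(),
            "net_weight": float(row["NTGEW"]),
        })
    return cleaned

def load(rows, conn):
    """Write cleaned rows into a staging table (idempotent upsert)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS material_staging "
        "(material TEXT PRIMARY KEY, description TEXT, unit TEXT, net_weight REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO material_staging VALUES (?, ?, ?, ?)",
        [(r["material"], r["description"], r["unit"], r["net_weight"]) for r in rows],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SAMPLE_EXTRACT)), conn)
count = conn.execute("SELECT COUNT(*) FROM material_staging").fetchone()[0]
```

The same extract/transform/load split carries over directly to the tools in the listing (e.g., a Databricks notebook or an Azure Data Factory mapping), where each stage becomes a separately monitorable, restartable job step.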

Knowledge and Skills Requirements

  • Deep experience with SAP BDC, data migration, and SAP integration techniques (LSMW, IDocs, BAPIs, BDC recordings).
  • Strong proficiency in ETL tools and frameworks, e.g., SAP BODS, Azure Data Factory, Informatica, or Talend.
  • Hands-on experience with cloud-based integration services, e.g., AWS Glue, Google Dataflow, Snowflake Tasks/Streams, or Databricks Workflows.
  • Familiarity with cloud data platforms (Azure Synapse, Google BigQuery, Snowflake) and parallel compute frameworks (NVIDIA RAPIDS, PySpark).
  • Strong skills in SQL, scripting (Python, Shell), and version control (Git).
  • Knowledge of API integrations, message queues, and event-driven data pipelines.
  • Experience in data quality validation and exception handling within pipelines.
  • Comfortable working in Agile delivery and CI/CD environments.
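The "data quality validation and exception handling within pipelines" skill above typically means gating records against rules and routing failures aside rather than aborting the whole job. A minimal sketch, with rule names and fields chosen purely for illustration:

```python
# Minimal data-quality gate: validate each record against simple rules
# and route failures to an exception list. Rules and field names are
# illustrative assumptions, not any particular tool's API.

RULES = {
    "material": lambda v: bool(v),               # required, non-empty
    "unit": lambda v: v in {"EA", "KG", "L"},    # controlled vocabulary
    "net_weight": lambda v: v is not None and v >= 0,  # no negative weights
}

def validate(records):
    """Split records into (passed, exceptions); exceptions list the failed rules."""
    passed, exceptions = [], []
    for rec in records:
        failures = [field for field, rule in RULES.items()
                    if not rule(rec.get(field))]
        if failures:
            exceptions.append({"record": rec, "failed": failures})
        else:
            passed.append(rec)
    return passed, exceptions

good, bad = validate([
    {"material": "100001", "unit": "EA", "net_weight": 0.012},
    {"material": "", "unit": "XX", "net_weight": -1.0},
])
```

Keeping the exception records (with the rule each one failed) is what makes the root-cause analysis mentioned in the duties tractable: failures can be reviewed, corrected, and replayed without rerunning the clean data.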

Other Requirements

  • Strong collaboration skills in global delivery models (onshore/offshore).
  • Certifications in relevant tools or cloud platforms (e.g., Azure Data Engineer, AWS Big Data) are a plus.
  • Working experience in regulated or large enterprise data environments is desirable.
  • Ability to travel based on project or client needs.

Job Requirements

  • Education
    • BS/MS in Computer Science, Data Engineering, Information Systems, or related field.
  • Experience
    • 8–12+ years of experience in data engineering and system integration, with 3–5 years focused on SAP data pipelines and cloud integration technologies.

What we offer you

We'll develop you with future-focused skills and equip you with world-class experiences.

18.11.2025
EY

EY SAP MDM & Data Governance Lead Germany, Baden-Württemberg, Mannheim

Find your dream job in the high-tech industry with Expoint. With our platform you can easily search for SAP MDM & Data Governance Lead opportunities at EY in Germany, Mannheim. Whether you're seeking a new challenge or looking to work with a specific organization in a specific role, Expoint makes it easy to find your perfect job match. Connect with top companies in your desired area and advance your career in the high-tech field. Sign up today and take the next step in your career journey with Expoint.