

Your key responsibilities
Data Profiling & Analysis
Execute comprehensive data profiling on source systems to identify data quality issues, patterns, and anomalies
Analyze large-scale datasets (millions of records) using statistical techniques to assess distributions, null rates, and outliers
Profile customer data to support deduplication and data matching requirements
Document data quality findings and contribute to baseline metrics reporting
Conduct cross-system data comparison to identify overlaps, conflicts, and data inconsistencies
Perform root cause analysis for data quality failures using systematic methodologies
Create Pareto charts and defect distribution analysis to support prioritization of remediation efforts
Data Quality Rules & Validation
Implement and maintain data quality rules across multiple business domains based on defined requirements
Develop validation logic for business-specific rules including calculations, limits, and regulatory requirements
Configure referential integrity checks across related data entities
Build lookup validation rules to prevent mapping and definition mismatches
Execute attribute-level validation to maintain high pass rates and detect regressions (see the illustrative sketch after this list)
Test and validate data quality rules in development and test environments before production deployment
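By way of illustration only, the sketch below shows the kind of rule logic this involves: completeness, lookup validity, uniqueness, and referential integrity checks expressed in pandas. The table names, columns, and reference values are hypothetical placeholders, not an actual client schema.

```python
# Illustrative sketch only: attribute-level data quality rules in pandas.
# The 'customers'/'policies' frames, columns, and reference values are hypothetical.
import pandas as pd

def validate_customers(customers: pd.DataFrame, policies: pd.DataFrame) -> dict:
    """Return pass rates (0.0-1.0) for a handful of example rules."""
    results = {}

    # Completeness: mandatory attributes must not be null.
    for col in ["customer_id", "date_of_birth", "country_code"]:
        results[f"{col}_not_null"] = customers[col].notna().mean()

    # Validity: lookup validation against an assumed reference list.
    valid_countries = {"CY", "GR", "DE", "GB"}  # hypothetical reference data
    results["country_code_in_lookup"] = customers["country_code"].isin(valid_countries).mean()

    # Uniqueness: the business key must not be duplicated.
    results["customer_id_unique"] = 1.0 - customers["customer_id"].duplicated().mean()

    # Referential integrity: every policy must reference an existing customer.
    results["policy_to_customer_fk"] = policies["customer_id"].isin(customers["customer_id"]).mean()

    return results

if __name__ == "__main__":
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2],
        "date_of_birth": ["1980-01-01", None, "1975-05-05"],
        "country_code": ["CY", "XX", "GR"],
    })
    policies = pd.DataFrame({"policy_id": [10, 11], "customer_id": [1, 99]})
    for rule, pass_rate in validate_customers(customers, policies).items():
        print(f"{rule}: {pass_rate:.0%}")
```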
Data Cleansing & Transformation
Implement data cleansing rules for standardization, normalization, and enrichment based on specifications
Apply address standardization and validation rules using industry-standard references
Execute name parsing and normalization processes for improved data matching accuracy
Support engineering teams with ETL transformation logic and data mapping validation
Implement data quality checkpoints within data pipelines (pre-transformation, post-transformation, pre-load), as illustrated in the sketch after this list
Validate that exception handling and error routing mechanisms work correctly for data quality failures
Support customer deduplication processes with data quality validation for matching and merge operations
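As a rough illustration only, the sketch below wires pre-transformation, post-transformation, and pre-load checkpoints into a toy pandas pipeline. The checkpoint rules, column names, and sample data are assumptions for the example, not a prescribed implementation.

```python
# Illustrative sketch only: pipeline checkpoints at three stages.
# Checkpoint rules, column names, and sample data are hypothetical.
import pandas as pd

def checkpoint(df: pd.DataFrame, stage: str, min_rows: int = 1) -> pd.DataFrame:
    """Fail fast if basic expectations are violated; otherwise pass the frame through."""
    if len(df) < min_rows:
        raise ValueError(f"[{stage}] expected at least {min_rows} rows, got {len(df)}")
    if df.isna().all(axis=1).any():
        raise ValueError(f"[{stage}] fully empty records detected")
    print(f"[{stage}] OK: {len(df)} rows")
    return df

def run_pipeline(source: pd.DataFrame) -> pd.DataFrame:
    df = checkpoint(source, "pre-transformation")

    # Example transformation: standardize names and drop exact duplicates.
    df = df.assign(name=df["name"].str.strip().str.title()).drop_duplicates()
    df = checkpoint(df, "post-transformation")

    # Final gate before handing the data to the target system.
    return checkpoint(df, "pre-load")

if __name__ == "__main__":
    source = pd.DataFrame({"name": ["  alice smith", "BOB JONES", "BOB JONES"]})
    run_pipeline(source)
```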
Data Quality Monitoring & Reporting
Execute real-time data quality monitoring across all data processing stages
Develop and maintain automated workflows for continuous data quality validation using orchestration tools
Configure alerting mechanisms for data quality threshold violations and degradation patterns (see the illustrative sketch after this list)
Build and maintain data quality dashboards using PowerBI or Tableau for stakeholder visibility
Track comprehensive data quality metrics including attribute success rates, volumetric reconciliation, and financial accuracy
Create technical dashboards with drill-down capabilities for root cause investigation
Contribute to trend analysis visualizations that support regression pattern detection
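A minimal sketch of threshold-based alerting follows; the metric names, values, and thresholds are hypothetical. Metrics computed upstream are compared against agreed limits and any violation is surfaced as an alert.

```python
# Illustrative sketch only: threshold-based alerting for data quality metrics.
# Metric names, values, and thresholds below are hypothetical.
THRESHOLDS = {"completeness": 0.98, "validity": 0.95, "uniqueness": 0.999}

def evaluate(metrics: dict) -> list:
    """Return an alert message for every metric that falls below its threshold."""
    alerts = []
    for name, threshold in THRESHOLDS.items():
        value = metrics.get(name)
        if value is not None and value < threshold:
            alerts.append(f"ALERT: {name} at {value:.2%} is below threshold {threshold:.2%}")
    return alerts

if __name__ == "__main__":
    todays_metrics = {"completeness": 0.97, "validity": 0.96, "uniqueness": 1.0}
    for alert in evaluate(todays_metrics):
        print(alert)  # in practice this would feed an alerting channel or dashboard
```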
Documentation & Data Lineage
Document data quality test cases, validation procedures, and testing results
Maintain data quality runbooks for issue resolution and troubleshooting
Support data lineage documentation showing transformation points and validation checkpoints
Contribute to data quality assessment reports for stakeholder review
Update lessons learned repository with data quality insights from testing activities
Maintain up-to-date documentation of data quality rules, validation logic, and test coverage
Collaboration & Stakeholder Management
Collaborate with test automation engineers on data validation strategies
Work closely with data architects and ETL developers to understand data flows and transformation logic
Partner with business analysts to translate business requirements into data quality validation rules
Participate in defect triage meetings and provide data quality analysis
Present data quality findings to technical and business stakeholders
Support UAT activities by providing data quality insights to business subject matter experts
Ensure clear and consistent communication with all stakeholders throughout the data quality lifecycle
Skills and attributes for success
Experience & Education
4-7 years of hands-on experience in data quality engineering or data analysis
Experience in large-scale data migration programs with millions of records
Bachelor's degree in Computer Science, Information Systems, Data Science, Engineering, or related field (preferred)
Data Quality Expertise
Strong understanding of data quality dimensions: completeness, accuracy, consistency, validity, timeliness, uniqueness
Experience designing and implementing data quality frameworks and validation rules
Proficiency in data profiling techniques and statistical analysis
Knowledge of data cleansing, standardization, and normalization methodologies
Experience with data reconciliation frameworks (volumetric, financial, attribute-level)
Technical Skills
Advanced SQL skills for complex data validation queries across multiple databases
Proficiency in Python for data quality automation, with libraries such as pandas, PyTest, and SQLAlchemy (see the illustrative sketch after this list)
Experience with data quality tools such as Great Expectations, PyDeequ, or enterprise DQ platforms (e.g. Informatica, Talend)
Knowledge of data warehouse platforms such as Snowflake, Databricks, Redshift
Experience with cloud technologies such as AWS, Azure, GCP for data processing
Familiarity with ETL/ELT tools (AWS Glue, Apache Airflow, Databricks)
Version control with Git and CI/CD pipeline integration
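For illustration only, the sketch below shows a PyTest-style data quality check executed as SQL, with an in-memory SQLite table standing in for a warehouse table; the table name, columns, sample rows, and 5% threshold are assumptions.

```python
# Illustrative sketch only: a PyTest-style data quality check executed as SQL.
# The table name, columns, sample rows, and 5% threshold are assumptions.
import sqlite3
import pytest

@pytest.fixture()
def connection():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (customer_id INTEGER, email TEXT)")
    conn.executemany(
        "INSERT INTO customers VALUES (?, ?)",
        [(1, "a@example.com"), (2, "b@example.com"), (3, "c@example.com")],
    )
    yield conn
    conn.close()

def test_email_null_rate_within_threshold(connection):
    """Fail the run if the email null rate exceeds the assumed 5% limit."""
    total, nulls = connection.execute(
        "SELECT COUNT(*), SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) FROM customers"
    ).fetchone()
    assert nulls / total <= 0.05, f"email null rate {nulls / total:.0%} exceeds 5%"
```

Run with `pytest`; the same pattern extends to referential integrity or volumetric reconciliation checks against a real warehouse connection.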
Data Analysis & Visualization
Experience creating dashboards and visualizations using PowerBI, Tableau, or similar tools
Strong analytical skills to identify patterns, trends, and anomalies in large datasets
Ability to perform statistical analysis and create meaningful metrics and KPIs
Experience with data visualization best practices for technical and executive audiences
Methodologies & Processes
Solid understanding of the software development lifecycle (SDLC) and Agile methodologies
Experience with data governance principles and frameworks
Knowledge of regulatory compliance requirements (e.g. data protection standards)
Root cause analysis and problem-solving methodologies
Strong interest in continuous improvement and lessons learned application
Soft Skills & Work Style
Strong problem-solving skills and attention to detail
Excellent command of English, both written and spoken
Ability to communicate complex technical concepts to non-technical stakeholders
Self-driven and flexible, able to work autonomously with a proven work ethic
Team player who enjoys working with people from different backgrounds and disciplines
Ability to work in a dynamic environment with excellent organizational and time management skills
Able to maintain a high level of confidentiality
It will be a plus if you have:
Domain Knowledge
Experience in financial services, insurance, banking, or highly regulated industries
Understanding of insurance policy lifecycle and claims workflows
Knowledge of customer data management and master data management (MDM) principles
Familiarity with regulatory requirements (e.g. PCI-DSS)
Advanced Technical Capabilities
Experience with PySpark for large-scale data processing
Knowledge of machine learning techniques for data quality improvement (anomaly detection, predictive quality)
Experience with Docker and Kubernetes for containerized data quality processes
Familiarity with data masking, anonymization, and synthetic data generation
Knowledge of Infrastructure as Code tools (Terraform, CloudFormation)
Data Quality Tools & Platforms
Hands-on experience with enterprise data quality platforms (e.g. Informatica DQ, Talend)
Experience with open-source data quality frameworks (e.g. Great Expectations, Deequ, Soda)
Knowledge of data catalog tools (e.g. Collibra, Alation, Apache Atlas)
Experience with data observability platforms (e.g. Monte Carlo, Datadog)
Migration & Transformation Experience
Previous involvement in large-scale data migration programs (1M+ records)
Experience with merger and acquisition data integration projects
Understanding of customer deduplication and entity resolution challenges
Knowledge of legacy system modernization and cloud migration patterns
Certifications
Data quality or data management certifications such as CDMP or DGSP
Cloud certifications such as AWS Certified Data Analytics, Azure Data Engineer, or GCP Data Engineer
Snowflake or Databricks certifications
ISTQB or software testing certifications
What we offer you
In addition to a competitive salary, our benefits include but are not limited to:
13th salary
Provident Fund
Private Medical and Life Insurance
Flexible working arrangements (hybrid work and flexible work schedule)
Friday afternoon off
EY Tech MBA and EY MSc in Business Analytics
EY Badges - digital learning certificates
Mobility programs (if interested to work abroad)
Paid Sick Leave
Paid Paternity Leave
Yearly wellbeing days off
Maternity, Wedding and New Baby Gifts
EY Employee Assistance Program (EAP) (counselling, legal and financial consultation services)
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
If you can demonstrate that you meet the criteria above, please contact us as soon as possible.