Job Description
Batch Integration Software Engineering
Join a healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare.
Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products.
We foster a sense of belonging, from everyday work to managing critical emergencies. And together, we must build connections and share best practices across the Tech Centers.
What will you do in this role
Develop comprehensive High-Level Technical Design and Data Mapping documents to meet specific business integration requirements.
Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
Define and implement robust Test Strategies and Test Plans, ensuring end-to-end accountability for middleware testing and evidence management.
Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
Analyze data requirements and translate them into technical specifications for ETL processes.
Develop and maintain ETL workflows, ensuring optimal performance and error-handling mechanisms are in place.
Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
Collaborate with data architects and other stakeholders to ensure alignment between data architecture and integration strategies.
Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.
What should you have
Bachelor's degree in Information Technology, Computer Science, or any technology stream.
Years of working experience with enterprise data integration technologies: Informatica Intelligent Data Management Cloud Services (CDI, CAI, Mass Ingest, Orchestration), PowerCenter
+ years of integration experience utilizing REST and custom API integration
+ years of working experience in relational database technologies and cloud data stores from AWS and Azure
2+ years of work experience utilizing the AWS Well-Architected Framework, deployment & integration, and data engineering.
Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, and Artifactory
Expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency
Experience in the design of reusable integration patterns using cloud-native technologies
Experience in process orchestration and scheduling integration jobs in Autosys and Airflow.
Experience in Agile development methodologies and release management techniques
Excellent analytical and problem-solving skills
Good understanding of data modeling and data architecture principles
What we look for
*A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.