Job Description
Join a healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare.
Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products.
Our Technology teams are focused on designing and developing the future of our IT operating model through Tech Centers. A focused group of leaders in each Tech Center helps us optimize connections and share best practices across the Tech Centers.
What will you do in this role
Develop comprehensive High-Level Technical Design and Data Mapping documents to meet specific business integration requirements.
Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
Define and implement robust Test Strategies and Test Plans, ensuring end-to-end accountability for middleware testing and evidence management.
Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
Work closely with various platforms and competencies to enrich the purpose of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
Analyze data requirements and translate them into technical specifications for ETL processes.
Develop and maintain ETL workflows, ensuring optimal performance and error handling mechanisms are in place.
Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.
What should you have
Bachelor's degree in Information Technology, Computer Science, or any technology stream.
8+ years of working experience with enterprise data integration technologies – Informatica Intelligent Data Management Cloud Services (CDI, CAI, Mass Ingest, Orchestration) and PowerCenter.
5+ years of integration experience utilizing REST and custom API integration.
8+ years of working experience in relational database technologies and cloud data stores from AWS and Azure.
2+ years of work experience utilizing the AWS Well-Architected Framework, deployment and integration, and data engineering.
Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, and Artifactory.
Expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency.
Experience in the design of reusable integration patterns using cloud-native technologies.
Experience in process orchestration and scheduling integration jobs in AutoSys and Airflow.
Experience in Agile development methodologies and release management techniques
Excellent analytical and problem-solving skills
Good understanding of data modeling and data architecture principles
*A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.