Job Description
What will you do in this role:
- Develop comprehensive High-Level Technical Design and Data Mapping documents to meet specific business integration requirements.
- Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
- Provide end-to-end delivery ownership for assigned data pipelines, performing data cleansing, processing, and validation to ensure data quality.
- Define and implement robust Test Strategies and Test Plans, ensuring end-to-end accountability for middleware testing and evidence management.
- Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
- Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
- Influence and drive cross-product collaboration while coordinating the execution of complex, technology-driven initiatives across distributed and remote teams.
- Work closely with platform and competency teams to advance the mission of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
- Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
- Analyze data requirements and translate them into technical specifications for ETL processes.
- Develop and maintain ETL workflows, ensuring optimal performance and error handling mechanisms are in place.
- Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
- Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
- Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.
What should you have:
- Bachelor’s degree in Information Technology, Computer Science, or another technology discipline
- 5+ years of hands-on experience with enterprise data integration technologies, including Informatica PowerCenter and Informatica Intelligent Data Management Cloud services (CDI, CAI, Mass Ingest, Orchestration)
- Integration experience with REST and custom APIs
- Experience with relational database technologies and cloud data stores on AWS, GCP, and Azure
- Experience with the AWS Well-Architected Framework, cloud deployment and integration, and data engineering
- Preferred: experience with CI/CD processes and related tools, including Terraform, GitHub Actions, and Artifactory
- Proven expertise in Python and Shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency
- Extensive experience designing reusable integration patterns using cloud-native technologies
- Extensive experience with process orchestration and scheduling integration jobs in AutoSys and Airflow
- Experience in Agile development methodologies and release management techniques
- Excellent analytical and problem-solving skills
- Good understanding of data modeling and data architecture principles
*A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.