Job Description
The Opportunity
- Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare.
- Be part of an organization driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products.
- Drive innovation and execution excellence. Join a team passionate about using data, analytics, and insights to drive decision-making, and about creating custom software that helps us tackle some of the world's greatest health threats.
What will you do in this role
- Design, develop, and maintain data pipelines to extract data from various sources and populate a data lake and data warehouse (see the sketch after this list).
- Work closely with IT risk analysts to understand data requirements and deliver solutions aligned with business goals.
- Build and maintain platforms that support data ingestion, transformation, and orchestration across various data sources, both internal and external.
- Use data orchestration, logging, and monitoring tools to build resilient pipelines.
- Automate data flows and pipeline monitoring to ensure scalability, performance, and resilience of the data integration (DI) platform.
- Monitor, troubleshoot, and resolve issues related to the data integration platform, ensuring uptime and reliability.
- Maintain thorough documentation for integration processes, configurations, and code to ensure easy onboarding for new team members and future scalability.
- Develop pipelines to ingest data into cloud data warehouses.
- Establish, modify, and maintain data structures and associated components.
- Create and deliver reports in accordance with stakeholder needs and conforming to agreed standards.
- Work within a matrix organizational structure, reporting to both the functional manager and the project manager.
- Participate in project planning, execution, and delivery, ensuring alignment with both functional and project goals.
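For a flavor of the day-to-day work, here is a minimal sketch of an ingestion pipeline: pull records from a REST API and land them in a cloud data lake. This is illustrative only; the endpoint, bucket, and field names (API_URL, BUCKET, "id", "severity") are hypothetical placeholders, the S3 target and boto3 usage are assumptions, and the team's actual stack and tooling may differ.

```python
"""Minimal sketch of an extract-transform-load job (illustrative only)."""
import json
import urllib.request
from datetime import datetime, timezone

import boto3  # AWS SDK; assumes credentials are configured in the environment

API_URL = "https://example.com/api/risk-findings"  # hypothetical source API
BUCKET = "example-data-lake"                       # hypothetical S3 bucket


def extract(url: str) -> list[dict]:
    """Fetch raw JSON records from the source API."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)


def transform(records: list[dict]) -> list[dict]:
    """Keep only the fields downstream consumers need and stamp load time."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [
        {"id": r["id"], "severity": r.get("severity"), "loaded_at": loaded_at}
        for r in records
    ]


def load(records: list[dict], bucket: str) -> str:
    """Write the batch to the data lake as date-partitioned newline-delimited JSON."""
    key = f"risk-findings/{datetime.now(timezone.utc):%Y/%m/%d}/batch.jsonl"
    body = "\n".join(json.dumps(r) for r in records)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body.encode())
    return key


if __name__ == "__main__":
    print("wrote", load(transform(extract(API_URL)), BUCKET))
```

In practice a job like this would be scheduled, logged, and monitored by an orchestration tool rather than run by hand, which is the resilience and automation work the role calls out above.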
What should you have
- Bachelor's degree in Information Technology, Computer Science, or a related technology stream.
- 3+ years of experience developing data pipelines and data infrastructure in an IT risk or cyber risk context.
- Demonstrated expertise in data integration and self-service analytics enablement.
- Experience with software/data engineering practices (including version control with GitHub, release management, dataset deployment, agile methods, and related software tools).
- Strong knowledge of databases and the underlying technology (cloud or on-prem environments, containerization, distributed storage and databases).
- Cloud-native experience, ideally AWS certified.
- Working knowledge of Python for data manipulation and scripting, APIs for data ingestion, and Power BI for stakeholder reporting.
- Good interpersonal and communication skills (verbal and written).
- Proven record of delivering high-quality results.
- Product and customer-centric approach.
- Innovative thinking, experimental mindset.
Working Hours: 11 AM to 8 PM IST
*A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.