Your Role and Responsibilities
This opportunity is for a new Open Data Lakehouse product that IBM intends to bring to market. We are looking for seasoned professionals who are passionate about data and familiar with the data warehouse, data lake, and data lakehouse products in the market.
- Work in an Agile, collaborative environment to build, deploy, configure, and maintain the Lakehouse (SaaS) on multiple hyperscalers.
- Work in an innovation-driven, collaborative environment to understand requirements and to architect, design, and implement features.
- Use continuous integration tools such as Jenkins and Artifactory to build automation pipelines that deploy the different service workloads for the Lakehouse.
- Collaborate with multiple development teams to enable a continuous integration environment that sustains high productivity levels and emphasizes defect prevention techniques.
- Design and implement automation for the deployment, monitoring, logging, and alerting of large-scale Lakehouse environments.
- Mentor team members.
Required Technical and Professional Expertise
- 9+ years of overall IT industry experience
- At least 3 years of experience with:
- Virtualization and containerization technologies, container orchestration software, and cloud platforms
- Kubernetes cluster administration
- Use of cloud services (Amazon Web Services, IBM Cloud, Microsoft Azure, Google Cloud Platform)
- Languages: Go, Python, Ruby
- CI/CD Tools: Jenkins, Artifactory
- Development and operation of fully managed SaaS services
- Source Control Tools: Git, GitHub
Preferred Technical and Professional Expertise
- Experience with cloud SaaS security
- Familiarity with RDBMSs, data warehouses, data lakes, and data lakehouses
- Familiarity with the Hive metastore and open data formats (Iceberg, Delta Lake, Hudi)
- Open-source data engines: Presto, Spark
- Data governance management
- Open-source software development