What you'll do:
You will work in the context of the SAP Business Technology Platform (BTP), which provides the multi-cloud foundation and a layer of abstraction for all applications to run on both public clouds (AWS, Azure, GCP, AliCloud) and private cloud. More specifically, you will work on data persistency services.
Design, code, test and assure quality of complex product features with team members.
Resolve complex issues within your own area of expertise and support others.
Work in a multi-cloud environment within a DevOps setup.
Use Infrastructure as Code methodologies to automate and manage infrastructure and services.
Learn new technologies and patterns and incorporate them into the product with minimal disruption.
Perform self-review and peer review of code.
Define and formulate coding best practices and guidelines for clean code.
Initiate, champion, and instill positive development practices within an agile team.
Work with architects in designing and delivering robust, secure cloud-native applications.
What you bring:
Excellent university degree (bachelor's, diploma, master's, PhD) in computer science or a related engineering discipline.
3+ years of relevant industry experience.
Sound understanding of cloud-native architecture, design, and development, as well as cloud platforms.
Good knowledge of software design patterns, including microservices and integrated systems.
Excellent hands-on proficiency in Java.
Experience running systems on container-based platforms (Kubernetes and/or Cloud Foundry).
Confidence working in Unix/Linux shell environments, with a focus on Bash/Python for automation scripting.
IaaS experience with at least one of AWS, Azure, GCP.
Experience working with CI/CD.
Excellent communication and interpersonal skills with fluency in English (written & spoken).
Experience and willingness to work in a project setup following agile principles and practices.
You will work with your team on SAP HANA Cloud & Data Lakehouse / Big Data Fabric Services. This product helps our customers access data from anywhere. The Big Data Fabric Services team is responsible for:
Extending and completing the SAP Data Platform with big data technologies to support different SAP Lines of Business.
Leveraging Apache Spark and object store technology to provide scalable and cost-effective compute and storage services for big data ingestion, processing, and sharing.
Integrating with SAP BTP and HANA Cloud & Data Lakehouse to offer a unified development and deployment experience for different SAP Lines of Business.
Assuring multi-tenancy and secure data processing.
In the BDFS team, we have an agile and collaborative culture that fosters knowledge sharing.