What you'll do
In your role as a Developer, you will be responsible for design, coding, testing, and quality assurance within a development team. You will assess and resolve issues in new and existing code, working with a high degree of attention to detail, reliability, and efficiency, and you will collaborate closely with your team members to ensure success. Your day-to-day tasks will include:
- Work on design, coding, and quality assurance tasks for specific product features.
- Carry out research and development in cloud computing, working with cutting-edge technologies such as Go, Python, and Apache Spark in a microservices architecture.
- Stay in close touch with the best minds in the industry in our region.
- Learn about state-of-the-art cloud deployment technologies such as Docker and Kubernetes.
- Learn how continuous integration works in a global development environment.
- Develop distributed applications that run in cluster mode (a minimal sketch follows this list).
- Work in an agile team that uses the Scrum framework to self-organize and work toward its development goals.
- Be part of a DevOps culture where CI/CD practices are put into practice.
- Be part of a globally distributed group that has an enormous impact on the enterprise software industry worldwide.
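To give a concrete flavor of the cluster-mode, Spark-based work mentioned above, here is a minimal PySpark sketch. It is a hypothetical example, not team code: the bucket paths and app name are made up for illustration.

```python
# Hypothetical word-count job (illustrative only; bucket paths are made up).
# Submitted with `spark-submit --deploy-mode cluster wordcount.py`, the driver
# itself runs inside the cluster rather than on the submitting machine.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-example").getOrCreate()

# Read raw text files from an object store (s3a:// is the Hadoop S3 connector scheme).
lines = spark.read.text("s3a://example-bucket/input/")

# Classic map/reduce word count over the distributed dataset.
counts = (
    lines.rdd.flatMap(lambda row: row.value.split())
         .map(lambda word: (word, 1))
         .reduceByKey(lambda a, b: a + b)
)

# Persist the result back to the object store as Parquet.
counts.toDF(["word", "count"]).write.mode("overwrite").parquet(
    "s3a://example-bucket/output/"
)

spark.stop()
```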
What you bring
Are you analytical and self-motivated, and do you relish problem solving? Are you skilled in time management and task prioritization? Do you enjoy continuous learning and working efficiently in a fast-paced environment? If this sounds like you, do you also bring the following?
- Bachelor’s degree or undergraduate studies in one of the following areas: Computer Science, Systems Analysis, Information Technology, Mathematics, Physics, or Engineering
- Ideally, one year of experience in a related position; recent graduates and undergraduates are also welcome
- Strong written and spoken communication skills in English
- Programming experience in Go (Golang) and Python
- Some experience with Docker, Kubernetes, and Apache Spark
The Big Data Fabric Services team is responsible for:
- Extending and completing the SAP Data Platform with big data technologies to support different SAP lines of business.
- Leveraging Apache Spark and object-store technology to provide scalable, cost-effective compute and storage services for big data ingestion, processing, and sharing (see the sketch after this list).
- Integrating with SAP BTP and HANA Cloud & Data Lake to offer a unified development and deployment experience for the different SAP lines of business.
- Ensuring multi-tenancy and secure data processing.
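As an illustration of the Spark-plus-object-store pattern described above, here is a hedged sketch of an ingestion job. The bucket names and the `timestamp` and `tenant_id` fields are assumptions made for illustration, not part of the actual service. It reads raw JSON events from an object store, aggregates them per tenant per day, and writes the result back as partitioned Parquet.

```python
# Hypothetical ingestion job (bucket names, the `timestamp` field, and the
# `tenant_id` field are all assumptions made for illustration).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-example").getOrCreate()

# Ingest raw JSON events from the object store.
events = spark.read.json("s3a://example-bucket/raw-events/")

# Aggregate events per tenant per day; keeping a tenant identifier in the
# schema is one simple way to keep multi-tenant data separable downstream.
daily = (
    events.withColumn("day", F.to_date("timestamp"))
          .groupBy("day", "tenant_id")
          .count()
)

# Write back as Parquet partitioned by day, so later reads stay selective.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3a://example-bucket/daily-counts/"
)

spark.stop()
```

Partitioning the output by day keeps downstream reads selective, and carrying a tenant column through the pipeline is one common way to support the multi-tenant processing mentioned above.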