Your Role and Responsibilities
This capability is the next generation of distributed, highly parallelized enterprise data systems, and it is key to the industry's future. You'll work alongside local and global (in Asia, Europe and North America) multidisciplinary teams of Developers, Architects, Engineers, DevOps specialists and Analysts creating the future of enterprise data management in the context of AI.
The successful candidate will be expected to deliver incremental value to our product in their area of expertise on a regular cadence. Our product must be ship-ready at our sprint boundaries, usually every two weeks. You will co-own architectural activities, such as requirements gathering and solution definition, with senior leaders to meet our customers' needs.
Embracing your inner drive, persevering through challenges, and striving for excellence are the powerful ingredients that can propel you towards achieving your goals and realizing your full potential. You will be a technical visionary for the technology area you work in.
Architectural activities include requirements gathering and solution architecture definition to meet our customers' needs. You will drive patentable value-add to our offerings.
Required Technical and Professional Expertise
- Advanced software development skills: Expert in one or more programming languages (e.g., Java, C, C++, Python, Go) with a strong understanding of the software development lifecycle, design, and implementation.
- Solid understanding of high-performance, distributed systems implementation techniques.
- Backend development expertise: Expert in developing, maintaining, and supporting sophisticated software systems, with a focus on high-quality code, Agile methodologies, and collaborative development practices.
- Technical leadership and collaboration: Proven ability to lead design and implementation of software components, work with cross-functional teams (e.g., Product Architects, Product Managers), and participate in code reviews to ensure software quality and knowledge sharing.
- DevOps and continuous delivery: Knowledgeable about container technologies (e.g., Docker, Kubernetes, OpenShift), network protocols (e.g., TCP/IP, HTTP), version control systems (e.g., GitHub), and build tools (e.g., Maven/Gradle), with experience in automated testing, load/performance testing, and technical documentation.
Preferred Technical and Professional Expertise
- BSc or MSc in a technical discipline
- Extensive experience in C/C++ and/or Java development, with a deep understanding of OOP concepts
- Advanced knowledge of distributed/HPC computing and the Big Data & Hadoop software stack: HDFS, Hive, HBase, Ambari
- Proven experience with compilers and/or RDBMSs, with strong skills in systems and low-level programming (TCP/IP, multi-threading, IPC) on the Linux platform.
- Familiarity with Agile software development methodologies, and proficiency in developing and deploying on cloud environments (e.g., AWS, IBM Cloud, Google Cloud).
- Experience with federated data technologies such as Gaian Db is advantageous, and experience with connectivity and ETL from multiple data sources is desirable.
- Established expertise in leading globally distributed product-based engineering teams, with a proven track record of developing and managing complex software systems.