Skillset Required
7+ years of experience in software development, with a strong foundation in distributed systems, cloud-native architectures, and data platforms.
Expertise in big data technologies such as Apache Spark and real-time streaming platforms such as Apache Kafka.
Advanced knowledge of a major cloud platform (AWS, Azure, GCP) and its ecosystem of data services.
Proficiency with Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
Strong understanding of advanced data modeling techniques and modern data warehouses.
Ability to design scalable, fault-tolerant, and maintainable distributed systems.
Excellent communication and stakeholder management skills.
Responsibilities
Design, build, deploy, and own complete, scalable, and reliable data platforms and solutions.
Lead the technical design of new data pipelines, services, and systems.
Build reusable frameworks, libraries, and data management tools to improve productivity.
Optimize performance and cost-efficiency of data workflows and compute layers.
Collaborate with stakeholders such as data scientists, business analysts, and product managers to translate business requirements into technical solutions.
Provide technical leadership and mentorship to junior engineers.
Ensure the uptime, reliability, and monitoring of the systems you build and own.