Skillset Required
5+ years of experience in software development, with a strong foundation in distributed systems, cloud-native architectures, and data platforms.
Experience with various stages of software development, including design, development, testing, and deployment.
Strong programming skills in Python, Java, C++, SQL, etc.
Deep expertise with cloud platforms like AWS, Azure, or GCP.
Hands-on experience with Databricks, Snowflake, AWS RDS, Azure SQL, or GCP Cloud SQL, as well as Apache Spark and Apache Airflow.
Strong understanding of Lakehouse architecture, Data Mesh principles, data governance frameworks, and modern data pipelines.
Proven ability to deliver high-impact, API-first services at scale.
Proficiency with infrastructure-as-code tools such as Terraform or AWS CloudFormation, for managing cloud infrastructure programmatically, ensuring consistency, and enabling automation.
Strong problem-solving and analytical thinking skills.
Experience in guiding and supporting junior engineers.
Excellent communication and collaboration skills.
Responsibilities
Design, develop, and maintain scalable data platforms and services.
Develop SDKs, APIs, and microservices to support enterprise-wide data and analytics needs.
Collaborate with cross-functional teams, including product managers and other engineers, to deliver high-impact solutions.
Implement best practices in software development, data governance, and platform observability.
Participate in code reviews, provide feedback, and mentor junior engineers.