- Build data mesh architectures that facilitate seamless data integration across different environments, using APIs, data virtualization, and integration platforms such as Apache NiFi or Talend.
- Design Event-Driven Architectures: Experience in designing and implementing event-driven architectures that support real-time data flows.
- Data Orchestration: Knowledge of orchestration tools such as Apache Airflow or Prefect to automate and manage data workflows.
- Design and implement self-service data infrastructure that empowers teams to manage their data independently.
- Familiarity with open standards and protocols to ensure data interoperability across domains.
- Metadata Management: Proficiency in managing and utilizing metadata to enhance data discovery, lineage, and governance. Familiarity with tools like Apache Atlas or Collibra is beneficial.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, with 5+ years of experience in big data engineering or a similar role.
- Strong proficiency in Java and Spring Boot for developing and deploying data applications.
- Extensive experience with Apache Spark and Apache Flink for both batch and stream data processing.
- Experience with open-source solutions such as dbt or SQLMesh is a plus.
- Familiarity with metadata management, data lineage, and data governance frameworks to ensure data integrity and knowledge discovery is a plus.