What You'll Be Doing:
Architect and implement efficient data pipelines to ingest, transform, and store operational telemetry.
Develop robust data models, schemas, and warehouses to ensure consistent reporting and traceability.
Build APIs and query layers (e.g., GraphQL, REST) for seamless data access and integration across systems.
Partner with engineering and operations teams to centralize telemetry, improve reporting accuracy, and enhance visibility.
Monitor data freshness, identify bottlenecks, and propose automation to reduce manual toil.
Provide expertise in reporting tools (e.g., Power BI, Tableau, Superset) to design actionable dashboards.
Mentor teams on data best practices, driving insights and operational efficiency through intelligent data systems.
What We Need To See:
5+ years of industry experience.
Bachelor’s or Master’s degree or equivalent experience.
Proven expertise in building and optimizing data pipelines using tools like Apache Spark, Airflow, or Kafka.
Proficiency in programming languages such as Python, Java, or Scala.
Strong experience with relational and NoSQL databases (e.g., Snowflake, BigQuery, PostgreSQL).
Familiarity with cloud storage and infrastructure (AWS S3, GCP BigQuery, Azure Data Lake).
Deep knowledge of data modeling, schema design, and data warehousing for high-scale systems.
Expertise in using reporting systems like Power BI, Tableau, or Superset, and guiding teams to derive actionable insights.
Excellent debugging and problem-solving skills in complex, distributed environments.
Ways to Stand Out from the Crowd:
Experience building real-time telemetry pipelines and integrating data into operational workflows.
A proven track record of driving standardization and governance in data systems.
Familiarity with workflow orchestration tools and their integration into broader business processes.
You will also be eligible for equity and benefits.