Expoint – all jobs in one place
Finding a high-tech job at the best companies has never been easier

Junior Data Analyst openings

The Junior Data Analyst role is a rising star in high-tech, offering a choice of interesting projects alongside dynamic, challenging work. Come find your next Junior Data Analyst position here at Expoint!
495 positions found
Yesterday

Elementor Product Analyst Israel

Yesterday

Unity Staff Data AI Engineer Israel, Tel-Aviv District, Tel-Aviv

Description:
SuperSonic is hiring a Staff Data AI Lead to lead our AI initiatives. As a Staff Data AI Lead, you will be responsible for leading the SuperSonic organization's AI integration efforts and serving as the bridge between advanced AI technologies and our business needs: implementing cutting-edge Data & AI technologies, creating AI-driven strategies, and integrating innovative AI solutions across multiple platforms.
What you'll be doing
  • Develop and execute AI strategies aligned with business objectives
  • Advise leadership on AI capabilities and potential applications
  • Guide teams in adopting AI tools and methodologies
  • Ensure ethical and efficient implementation of AI technologies
  • Design and oversee AI-driven process improvements
  • Collaborate with various departments to identify AI opportunities
  • Stay current with the latest AI trends and advancements
  • Conduct AI-related training and workshops for staff
  • Manage AI projects from conception to implementation
  • Evaluate and recommend AI tools and platforms
  • Lead a team of AI engineers
What we're looking for
  • Deep understanding of AI technologies, including large language models
  • Expertise in prompt engineering and AI-powered automation
  • Proficiency with AI tools such as ChatGPT, Claude, Midjourney, and Copilot
  • Knowledge of AI ethics and regulatory considerations
  • Strong problem-solving and analytical skills
  • Proficiency with Python or TypeScript for building AI workflows and data pipelines (a minimal sketch follows this list)
  • Excellent communication and leadership abilities
  • Ability to translate complex AI concepts for non-technical audiences
  • Experience in project management and cross-functional collaboration
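To make the prompt-engineering and Python-workflow expectations above concrete, here is a minimal illustrative sketch; it is not part of the posting. The call_llm function is a hypothetical stand-in for whichever LLM provider client the team actually uses, and the ticket-classification task is an invented example.

import json

def call_llm(prompt: str) -> str:
    # Hypothetical placeholder for a provider SDK call (e.g. an OpenAI or Anthropic client).
    raise NotImplementedError

def classify_ticket(ticket_text: str) -> dict:
    # Prompt engineering: constrain the model to a strict JSON schema so the output
    # can feed downstream automation without manual review.
    prompt = (
        "Classify the following support ticket.\n"
        'Respond with JSON only: {"category": "...", "urgency": "low|medium|high"}.\n\n'
        f"Ticket: {ticket_text}"
    )
    raw = call_llm(prompt)
    return json.loads(raw)  # fails loudly if the model drifts from the schema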
You might also have
  • Advanced degree in Computer Science, AI, or related field
  • Previous experience in AI implementation within an organizational setting
  • Certifications in relevant AI technologies or platforms
  • Familiarity with no-code AI application development
  • Bachelor's degree in Computer Science, AI, or related field
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.

16.11.2025

Unity Senior DevOps Engineer Data Platform Israel, Tel-Aviv District, Tel-Aviv

Description:
The opportunity
  • Technical Leadership & Architecture: Drive data infrastructure strategy and establish standardized patterns for AI/ML workloads, with direct influence on architectural decisions across data and engineering teams
  • DataOps Excellence: Create seamless developer experience through self-service capabilities while significantly improving data engineer productivity and pipeline reliability metrics
  • Cross-Functional Innovation: Lead collaboration between DevOps, Data Engineering, and ML Operations teams to unify our approach to infrastructure as code and orchestration platforms
  • Technology Breadth & Growth: Work across the full DataOps spectrum from pipeline orchestration to AI/ML infrastructure, with clear advancement opportunities as a senior infrastructure engineer
  • Strategic Business Impact: Build scalable analytics capabilities that provide direct line of sight between your infrastructure work and business outcomes through reliable, cutting-edge data solutions
What you'll be doing
  • Design Data-Native Cloud Solutions - Design and implement scalable data infrastructure across multiple environments using Kubernetes, orchestration platforms, and IaC to power our AI, ML, and analytics ecosystem
  • Define DataOps Technical Strategy - Shape the technical vision and roadmap for our data infrastructure capabilities, aligning DevOps, Data Engineering, and ML teams around common patterns and practices
  • Accelerate Data Engineer Experience - Spearhead improvements to data pipeline deployment, monitoring tools, and self-service capabilities that empower data teams to deliver insights faster with higher reliability
  • Engineer Robust Data Platforms - Build and optimize infrastructure that supports diverse data workloads from real-time streaming to batch processing, ensuring performance and cost-effectiveness for critical analytics systems
  • Drive DataOps Excellence - Collaborate with engineering leaders across data teams, champion modern infrastructure practices, and mentor team members to elevate how we build, deploy, and operate data systems at scale
What we're looking for
  • 3+ years of hands-on DevOps experience building, shipping, and operating production systems.
  • Coding proficiency in at least one language (e.g., Python or TypeScript); able to build production-grade automation and tools.
  • Cloud platforms: deep experience with AWS, GCP, or Azure (core services, networking, IAM).
  • Kubernetes: strong end-to-end understanding of Kubernetes as a system (routing/networking, scaling, security, observability, upgrades), with proven experience integrating data-centric components (e.g., Kafka, RDS, BigQuery, Aerospike).
  • Infrastructure as Code: design and implement infrastructure automation using tools such as Terraform, Pulumi, or CloudFormation (modular code, reusable patterns, pipeline integration); a minimal sketch follows this list.
  • GitOps & CI/CD: practical experience implementing pipelines and advanced delivery using tools such as Argo CD / Argo Rollouts, GitHub Actions, or similar.
  • Observability: metrics, logs, and traces; actionable alerting and SLOs using tools such as Prometheus, Grafana, ELK/EFK, OpenTelemetry, or similar.
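As one hedged illustration of the Infrastructure as Code bullet above (not taken from the posting), here is a minimal Pulumi sketch in Python. The resource names, tags, and the choice of AWS are assumptions for the example.

import pulumi
import pulumi_aws as aws

# A versioned S3 bucket for raw pipeline data, defined as code so it can be
# reviewed in a pull request and promoted through CI like any other change.
raw_bucket = aws.s3.Bucket(
    "raw-events",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
    tags={"team": "data-platform", "env": "dev"},
)

pulumi.export("raw_bucket_name", raw_bucket.id)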
You might also have
  • Data Pipeline Orchestration - Demonstrated success building and optimizing data pipeline deployment using modern tools (Airflow, Prefect, Kubernetes operators) and implementing GitOps practices for data workloads
  • Data Engineer Experience Focus - Track record of creating and improving self-service platforms, deployment tools, and monitoring solutions that measurably enhance data engineering team productivity
  • Data Infrastructure Deep Knowledge - Extensive experience designing infrastructure for data-intensive workloads including streaming platforms (Kafka, Kinesis), data processing frameworks (Spark, Flink), storage solutions, and comprehensive observability systems
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.

15.11.2025

Unity Data Platform Engineering Lead Israel, Tel-Aviv District, Tel-Aviv

Description:

  • Unify online/offline for features: Drive Flink adoption and patterns that keep features consistent and low-latency for experimentation and production.
  • Make self-serve real: Build golden paths, templates, and guardrails so product/analytics/DS engineers can move fast safely.
  • Run multi-tenant compute efficiently: EMR on EKS powered by Karpenter on Spot instances; right-size Trino/Spark/Druid for performance and cost.
  • Cross-cloud interoperability: BigQuery + BigLake/Iceberg interop where it makes sense (analytics, experimentation, partnership).

What you'll be doing
  • Leading a senior Data Platform team: setting clear objectives, unblocking execution, and raising the engineering bar.
  • Owning SLOs, on-call, incident response, and postmortems for core data services.
  • Designing and operating EMR on EKS capacity profiles, autoscaling policies, and multi-tenant isolation.
  • Tuning Trino (memory/spill, CBO, catalogs), Spark/Structured Streaming jobs, and Druid ingestion/compaction for sub-second analytics.
  • Extending Flink patterns for the feature platform (state backends, checkpointing, watermarks, backfills); a minimal sketch follows this list.
  • Driving FinOps work: CUR-based attribution, S3 Inventory-driven retention/compaction, Reservations/Savings Plans strategy, OpenCost visibility.
  • Partnering with product engineering, analytics, and data science & ML engineers on roadmap, schema evolution, and data product SLAs.
  • Leveling up observability (Prometheus/VictoriaMetrics/Grafana), data quality checks, and platform self-service tooling.
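As a hedged illustration of the Flink patterns named above (checkpointing and watermarks), here is a minimal PyFlink sketch; it is not part of the posting. The checkpoint interval, lateness bound, and in-memory source are assumptions - in production the source would typically be Kafka.

from pyflink.common import Duration, WatermarkStrategy
from pyflink.common.watermark_strategy import TimestampAssigner
from pyflink.datastream import StreamExecutionEnvironment

class EventTimeAssigner(TimestampAssigner):
    def extract_timestamp(self, value, record_timestamp):
        return value[1]  # second tuple field holds the event-time epoch in milliseconds

env = StreamExecutionEnvironment.get_execution_environment()
env.enable_checkpointing(60_000)  # checkpoint every 60s so operator state can be recovered

watermarks = (
    WatermarkStrategy
    .for_bounded_out_of_orderness(Duration.of_seconds(10))  # tolerate 10s of event-time disorder
    .with_timestamp_assigner(EventTimeAssigner())
)

events = env.from_collection([("user_1", 1_700_000_000_000)])  # stand-in for a Kafka source
events.assign_timestamps_and_watermarks(watermarks).print()
env.execute("feature-pipeline-sketch")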
What we're looking for
  • 2+ years leading engineers (team lead or manager) building/operating large-scale data platforms; 5+ years total in Data Infrastructure/DataOps roles.
  • Proven ownership of cloud-native data platforms on AWS: S3, EMR (preferably EMR on EKS), IAM, Glue/Data Catalog, Athena.
  • Production experience with Apache Iceberg (schema evolution, compaction, retention, metadata ops) and columnar formats (Parquet/Avro); a maintenance sketch follows this list.
  • Hands-on depth in at least two of: Trino/Presto, Apache Spark/Structured Streaming, Apache Druid, Apache Flink.
  • Strong conceptual understanding of Kubernetes (EKS), including autoscaling, isolation, quotas, and observability.
  • Strong SQL skills and extensive experience with performance tuning, with solid proficiency in Python/Java.
  • Solid understanding of Kafka concepts; hands-on experience is a plus.
  • Experience running on-call for data platforms and driving measurable SLO-based improvements.
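To ground the Iceberg maintenance items above (compaction, retention, metadata ops), here is a minimal sketch using Iceberg's Spark SQL procedures; the catalog name, table name, and retention timestamp are placeholders, and the posting does not prescribe this exact approach.

from pyspark.sql import SparkSession

# Assumes a Spark session already configured with an Iceberg catalog named "analytics".
spark = SparkSession.builder.appName("iceberg-maintenance-sketch").getOrCreate()

# Compact small files left behind by streaming and batch writers.
spark.sql("CALL analytics.system.rewrite_data_files(table => 'db.events')")

# Expire old snapshots to bound metadata size and storage cost.
spark.sql(
    "CALL analytics.system.expire_snapshots("
    "table => 'db.events', older_than => TIMESTAMP '2025-10-01 00:00:00')"
)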
You might also have
  • Experience building feature platforms (feature definitions, materialization, serving, and online/offline consistency).
  • Airflow (or similar) at scale; Argo experience is a plus.
  • Familiarity with BigQuery (and ideally BigLake/Iceberg interop) and operational DBs like Aurora MySQL.
  • Experience with Clickhouse / Snowflake / Databricks / Starrocks.
  • FinOps background (cost attribution/showback, Spot strategies).
  • Data quality, lineage, and cataloging practices in large orgs.
  • IaC (Terraform/CloudFormation)
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.

14.11.2025

Bezeqint DATA NETWORK Support Specialist Israel

Description:

Job description

Bezeq International TECH is recruiting a communications specialist - DATA & NETWORKING support.

The support specialist works on a team that provides technical service - activating international links and providing technical support for these services to the company's Israeli and global customers.

Because the work involves global customers, a high level of English is required - comprehension, technical English, and the ability to express yourself verbally and in writing.

The goal is to provide technical support in the data-communications domain and to give these customers connectivity (proper traffic flow) over the dedicated communication lines we have set up for them. Support is provided end to end - both activating and setting up these projects, and maintenance, troubleshooting (such as line drops, latency, packet loss, and more), and ongoing service.

Service is provided to customers remotely, and the role requires working with a variety of internal and external interfaces - such as sales staff within the division, project managers, and foreign operators abroad (such as BT, AT&T, Verizon, and more).

The team's operating hours are 8:00-19:00.

Requirements

  • Familiarity and experience with data networking, including WAN protocols (BGP) and the seven-layer OSI model - required
  • Familiarity with Cisco and Juniper equipment, at the configuration and service level - required
  • High proficiency in English, written and spoken - required
  • Willingness to be on call (mainly via remote access) - required
13.11.2025

ORCA Security Data/Product Analyst Israel, Tel-Aviv District, Tel-Aviv

Description:
Highlights
  • High-growth: Over the past six years, we’ve consistently achieved milestones that take other companies a decade or more. During this time, we’ve significantly grown our employee base, expanded our customer reach, and rapidly advanced our product capabilities.
  • Disruptive innovation: Our founders saw that traditional security didn’t work for the cloud, so they set out to carve a new path. We’re relentless pioneers who invented agentless technology and continue to be the most comprehensive and innovative cloud security company.
  • Well-capitalized: With a valuation of $1.8 billion, Orca is a cybersecurity unicorn dominating the cloud security space. We’re backed by an impressive team of investors such as Capital G, ICONIQ, GGV, and SVCI, a syndicate of CISOs who invest their own money after conducting their due diligence.
  • Respectful and transparent culture: Our executives pride themselves on being accessible to everyone and believe in sharing knowledge with the employees. Each employee has a place in shaping the future of our industry.
About the role

As a Product & Data Analyst at Orca, you will take ownership of product analytics, along with responsibility for our BI infrastructure. You will help product teams make smarter decisions by uncovering customer insights, while also ensuring the data pipelines and models that power those insights are robust and scalable.

In this role, you will work directly with Product Managers to analyze customer behavior, evaluate new features, and measure product impact. You'll also take responsibility for the underlying data stack: writing SQL, modeling in DBT, and maintaining clean, efficient ETLs. Beyond supporting day-to-day product questions, you will build dashboards and reporting tools that give teams across the company access to reliable, self-serve insights.
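As a hedged illustration of the SQL/DBT modeling described above (not taken from the posting), here is a minimal dbt Python model. It assumes a Spark-based dbt adapter and a hypothetical staging model named stg_events; Orca's actual models and schema are not public.

# models/feature_adoption.py - hypothetical dbt Python model
import pyspark.sql.functions as F

def model(dbt, session):
    # Materialize as a table so BI dashboards can query it directly.
    dbt.config(materialized="table")

    events = dbt.ref("stg_events")  # hypothetical staging model of raw product events

    # Daily count of distinct accounts that used each feature.
    return (
        events
        .groupBy("event_date", "feature_name")
        .agg(F.countDistinct("account_id").alias("active_accounts"))
    )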

About you
  • 3-5 years of experience as a Data Analyst, preferably at a SaaS company
  • Proven experience in SQL writing, specifically in querying large and complex data sets
  • Experience in BI tools and in DBT/Python
  • Looker proficiency – an advantage
  • Ability to translate analytical conclusions into business insights and actions, along with strong statistical analysis skills and self-learning capabilities
  • B.Sc./BA in industrial/information systems engineering, statistics or equivalent
  • Experience in operational or cross-department roles – an advantage
  • Great communication skills
  • Business orientation and a passion for data analytics
  • High level of spoken and written English
Description:
Responsibilities
  • Partner with Product and R&D teams to define success metrics and measure product performance.
  • Analyze user journeys, A/B tests, and feature adoption to identify insights that drive growth.
  • Build dashboards and reports that make complex data accessible and actionable.
  • Collaborate across departments to turn insights into meaningful business and product impact.
  • Leverage advanced analytics tools to support an AI-first, data-informed culture.
Requirements
  • 4+ years of experience as a Product Analyst in a web or SaaS company.
  • Proven ability to translate complex data into clear narratives and strategic recommendations.
  • Strong SQL skills and experience with BI tools (Looker, Tableau, Power BI, etc.).
  • Understanding of experimentation methodologies (A/B testing, funnel analysis, etc.); a minimal readout sketch follows this list.
  • Experience with Python, R, or another scripting language - a plus.
  • Curiosity, self-motivation, and a passion for learning fast and iterating even faster.
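As an illustration of the experimentation methodologies listed above (not part of the posting), here is a minimal A/B-test readout in Python. The conversion counts are invented, and a real analysis would also consider sample-size planning and multiple comparisons.

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: conversions and exposures for control (A) and variant (B).
conversions = [420, 470]
exposures = [10_000, 10_000]

# Two-sided two-proportion z-test on the conversion rate difference.
z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)

rate_a, rate_b = (c / n for c, n in zip(conversions, exposures))
print(f"control={rate_a:.2%} variant={rate_b:.2%} z={z_stat:.2f} p={p_value:.3f}")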

Why you’ll love working here

You’ll find a team that’s driven by data, powered by creativity, and truly cares about its people.

We believe in flexibility, autonomy, and balance - it’s about impact, not hours.

A wide range of positions like Junior Data Analyst. Finding a job at top companies is no longer just a dream. Expoint helps you find sought-after positions across a wide range of the world's leading countries, so you can land a challenging role in a country you'll enjoy working in.