Expoint – all jobs in one place
Finding a high-tech job at the best companies has never been easier

Data Scientist Wanted - Data Scientist Positions

Data scientists play an important role in the development of artificial intelligence and machine learning, building mathematical algorithms that predict behavioral models. The role is considered one of the most prestigious and influential in shaping technology and innovation. Come take part in an industry that is revolutionizing the world and find a job in the field. With Expoint you can find your next position quickly and easily!
350 jobs found
23.11.2025

Unity Senior DevOps Engineer Data Platform Israel, Tel-Aviv District, Tel-Aviv

Description:
The opportunity
  • Technical Leadership & Architecture: Drive data infrastructure strategy and establish standardized patterns for AI/ML workloads, with direct influence on architectural decisions across data and engineering teams
  • DataOps Excellence: Create seamless developer experience through self-service capabilities while significantly improving data engineer productivity and pipeline reliability metrics
  • Cross-Functional Innovation: Lead collaboration between DevOps, Data Engineering, and ML Operations teams to unify our approach to infrastructure as code and orchestration platforms
  • Technology Breadth & Growth: Work across the full DataOps spectrum from pipeline orchestration to AI/ML infrastructure, with clear advancement opportunities as a senior infrastructure engineer
  • Strategic Business Impact: Build scalable analytics capabilities that provide direct line of sight between your infrastructure work and business outcomes through reliable, cutting-edge data solutions
What you'll be doing
  • Design Data-Native Cloud Solutions - Design and implement scalable data infrastructure across multiple environments using Kubernetes, orchestration platforms, and IaC to power our AI, ML, and analytics ecosystem
  • Define DataOps Technical Strategy - Shape the technical vision and roadmap for our data infrastructure capabilities, aligning DevOps, Data Engineering, and ML teams around common patterns and practices
  • Accelerate Data Engineer Experience - Spearhead improvements to data pipeline deployment, monitoring tools, and self-service capabilities that empower data teams to deliver insights faster with higher reliability
  • Engineer Robust Data Platforms - Build and optimize infrastructure that supports diverse data workloads from real-time streaming to batch processing, ensuring performance and cost-effectiveness for critical analytics systems
  • Drive DataOps Excellence - Collaborate with engineering leaders across data teams, champion modern infrastructure practices, and mentor team members to elevate how we build, deploy, and operate data systems at scale
What we're looking for
  • 3+ years of hands-on DevOps experience building, shipping, and operating production systems.
  • Coding proficiency in at least one language (e.g., Python or TypeScript); able to build production-grade automation and tools.
  • Cloud platforms: deep experience with AWS, GCP, or Azure (core services, networking, IAM).
  • Kubernetes: strong end-to-end understanding of Kubernetes as a system (routing/networking, scaling, security, observability, upgrades), with proven experience integrating data-centric components (e.g., Kafka, RDS, BigQuery, Aerospike).
  • Infrastructure as Code: design and implement infrastructure automation using tools such as Terraform, Pulumi, or CloudFormation (modular code, reusable patterns, pipeline integration).
  • GitOps & CI/CD: practical experience implementing pipelines and advanced delivery using tools such as Argo CD / Argo Rollouts, GitHub Actions, or similar.
  • Observability: metrics, logs, and traces; actionable alerting and SLOs using tools such as Prometheus, Grafana, ELK/EFK, OpenTelemetry, or similar.
You might also have
  • Data Pipeline Orchestration - Demonstrated success building and optimizing data pipeline deployment using modern tools (Airflow, Prefect, Kubernetes operators) and implementing GitOps practices for data workloads (a minimal orchestration sketch follows this list)
  • Data Engineer Experience Focus - Track record of creating and improving self-service platforms, deployment tools, and monitoring solutions that measurably enhance data engineering team productivity
  • Data Infrastructure Deep Knowledge - Extensive experience designing infrastructure for data-intensive workloads including streaming platforms (Kafka, Kinesis), data processing frameworks (Spark, Flink), storage solutions, and comprehensive observability systems
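The orchestration bullet above names Airflow among the relevant tools; as a purely illustrative sketch (not part of this listing), a minimal pipeline in a recent Airflow 2.x TaskFlow style might look like the following. The DAG id, schedule, and task bodies are hypothetical placeholders.

# Minimal, illustrative Airflow DAG using the Airflow 2.x TaskFlow API.
# The DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_data_pipeline():

    @task
    def extract():
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records):
        # Apply a trivial transformation step.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records):
        # Persist the result (stubbed as a log line).
        print(f"loaded {len(records)} records")

    load(transform(extract()))


example_data_pipeline()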
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.

23.11.2025

Forter Data Researcher Israel, Tel Aviv District, Tel Aviv-Yafo

Description:

What you'll be doing:

  • Invent, design, implement, and refine our system’s core decisioning logic and models in a live production environment.
  • Conduct in-depth research into complex fraud patterns, adversarial networks, and emerging global threats.
  • Leverage rich datasets to derive actionable insights, develop new system components, and advance our feature engineering processes.
  • Develop, prototype, and automate new tools and processes to enhance the precision and scale of our systems.
  • Collaborate with a world-class team of Data Scientists, Analysts, Researchers, and Engineers to develop the next generation of Forter’s AI technology.

What you'll need:

  • Relevant experience (one of the below)
    • At least 2 years of hands-on experience in quantitative research/data science, or a related role involving production-oriented data analytics and hypothesis-led research.
    • MSc or PhD in a quantitative field (e.g., Physics, Economics, Neuroscience, Biotechnology, Computer Science, etc.).
  • Strong analytical and logical reasoning skills with a proven ability to dissect and solve highly complex problems.
  • Extensive experience working with large datasets using scripting languages (e.g., Python, R, or Matlab).
  • Excellent communication skills – ability to articulate complex technical concepts and research findings to diverse audiences.
Bonus points for:
  • Experience with SQL and big data technologies (e.g., Spark) (see the illustrative sketch after this list).
  • Deep familiarity with machine learning concepts and practice
  • Risk / Intelligence experience
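As a hedged illustration of the dataset-driven research work described in this listing (the transaction columns and aggregates below are entirely hypothetical examples, not Forter's data or methods), a small pandas sketch of building per-account features might look like this:

# Illustrative only: hypothetical transaction data and toy per-account features.
import pandas as pd

transactions = pd.DataFrame({
    "account_id": [1, 1, 2, 2, 2],
    "amount": [50.0, 900.0, 20.0, 25.0, 2000.0],
    "ts": pd.to_datetime([
        "2025-01-01 10:00", "2025-01-01 10:05",
        "2025-01-02 09:00", "2025-01-02 09:30", "2025-01-02 09:31",
    ]),
})

# Simple per-account aggregates of the kind often fed into models.
features = (
    transactions.sort_values("ts")
    .groupby("account_id")
    .agg(txn_count=("amount", "size"),
         mean_amount=("amount", "mean"),
         max_amount=("amount", "max"))
    .reset_index()
)
print(features)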

Trust is backed by data – Forter is a recipient of over 10 workplace and innovation awards, including:

  • Great Place to Work Certification (2021, 2022, 2023, )
  • Fortune’s Best Workplaces in NYC (2022, 2023 and )
  • Forbes Cloud 100 (2021, 2022, 2023 and )
  • #3 on Fast Company’s list of “Most Innovative Finance Companies” ( )
  • Anti-Fraud Solution of the Year at the Payments Awards ( )
  • SAP Pinnacle Awards “New Partner Application Award” (2023)
  • Fintech Breakthrough Awards – Best Fraud Prevention Platform (2023)

More jobs that may interest you

23.11.2025

Unity Staff Data AI Engineer Israel, Tel-Aviv District, Tel-Aviv

Description:
SuperSonic is hiring a Staff Data AI Lead to lead our AI initiatives. As a Staff Data AI Lead, you will be responsible for leading SuperSonic's organization-wide AI integration efforts and serving as the bridge between advanced AI technologies and our business needs: implementing cutting-edge Data & AI technologies, creating AI-driven strategies, and integrating innovative AI solutions across multiple platforms.
What you'll be doing
  • Develop and execute AI strategies aligned with business objectives
  • Advise leadership on AI capabilities and potential applications
  • Guide teams in adopting AI tools and methodologies
  • Ensure ethical and efficient implementation of AI technologies
  • Design and oversee AI-driven process improvements
  • Collaborate with various departments to identify AI opportunities
  • Stay current with the latest AI trends and advancements
  • Conduct AI-related training and workshops for staff
  • Manage AI projects from conception to implementation
  • Evaluate and recommend AI tools and platforms
  • Lead a team of AI engineers
What we're looking for
  • Deep understanding of AI technologies, including large language models
  • Expertise in prompt engineering and AI-powered automation
  • Proficiency with AI tools such as ChatGPT, Claude, Midjourney, and Copilot
  • Knowledge of AI ethics and regulatory considerations
  • Strong problem-solving and analytical skills
  • Proficiency with Python or TypeScript for building AI workflows and data pipelines (a minimal workflow sketch follows this list)
  • Excellent communication and leadership abilities
  • Ability to translate complex AI concepts for non-technical audiences
  • Experience in project management and cross-functional collaboration
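As a hedged, minimal sketch of the kind of Python AI workflow this list mentions (the prompt template and the call_model function below are hypothetical stand-ins, not a specific vendor API), automation around a language model often reduces to a template plus a retry wrapper:

# Illustrative prompt-template and retry wrapper; call_model is a hypothetical
# placeholder for whatever LLM client is actually used.
import time

PROMPT_TEMPLATE = (
    "Summarize the following support ticket in one sentence "
    "and label its priority as LOW, MEDIUM, or HIGH:\n\n{ticket}"
)


def call_model(prompt):
    # Placeholder: a real workflow would call an LLM API here.
    return "Example summary. Priority: MEDIUM"


def summarize_ticket(ticket, retries=3, backoff_s=1.0):
    prompt = PROMPT_TEMPLATE.format(ticket=ticket)
    for attempt in range(retries):
        try:
            return call_model(prompt)
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(backoff_s * (2 ** attempt))  # simple exponential backoff


if __name__ == "__main__":
    print(summarize_ticket("Customer cannot log in after password reset."))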
You might also have
  • Advanced degree in Computer Science, AI, or related field
  • Previous experience in AI implementation within an organizational setting
  • Certifications in relevant AI technologies or platforms
  • Familiarity with no-code AI application development
  • Bachelor's degree in Computer Science, AI, or related field
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.


More jobs that may interest you

22.11.2025

Unity Data Platform Engineering Lead Israel, Tel-Aviv District, Tel-Aviv

Description:

Unify online/offline for features: Drive Flink adoption and patterns that keep features consistent and low-latency for experimentation and production.

Make self-serve real: Build golden paths, templates, and guardrails so product/analytics/DS engineers can move fast safely.

Run multi-tenant compute efficiently: EMR on EKS powered by Karpenter on Spot instances; right-size Trino/Spark/Druid for performance and cost.

Cross-cloud interoperability: BigQuery + BigLake/Iceberg interop where it makes sense (analytics, experimentation, partnership).
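As a hedged sketch of the streaming-to-lakehouse pattern these paragraphs describe (the topic, catalog, bucket, and table names are hypothetical, and the job assumes a Spark session launched with the Kafka and Iceberg runtime packages and an Iceberg catalog configured), a minimal Structured Streaming job might look like:

# Illustrative PySpark Structured Streaming job: Kafka source -> Iceberg table.
# Assumes the session is started with the Kafka and Iceberg packages and a
# catalog named "demo"; every name below is a hypothetical placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-to-iceberg").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "events")
    .load()
    .select(
        F.col("key").cast("string"),
        F.col("value").cast("string"),
        F.col("timestamp"),
    )
)

query = (
    events.writeStream.format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events")
    .toTable("demo.analytics.events")
)

query.awaitTermination()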

What you'll be doing
  • Leading a senior Data Platform team: setting clear objectives, unblocking execution, and raising the engineering bar.
  • Owning SLOs, on-call, incident response, and postmortems for core data services.
  • Designing and operating EMR on EKS capacity profiles, autoscaling policies, and multi-tenant isolation.
  • Tuning Trino (memory/spill, CBO, catalogs), Spark/Structured Streaming jobs, and Druid ingestion/compaction for sub-second analytics.
  • Extending Flink patterns for the feature platform (state backends, checkpointing, watermarks, backfills).
  • Driving FinOps work: CUR-based attribution, S3 Inventory-driven retention/compaction, Reservations/Savings Plans strategy, OpenCost visibility.
  • Partnering with product engineering, analytics, and data science & ML engineers on roadmap, schema evolution, and data product SLAs.
  • Leveling up observability (Prometheus/VictoriaMetrics/Grafana), data quality checks, and platform self-service tooling.
What we're looking for
  • 2+ years leading engineers (team lead or manager) building/operating large-scale data platforms; 5+ years total in Data Infrastructure/DataOps roles.
  • Proven ownership of cloud-native data platforms on AWS: S3, EMR (preferably EMR on EKS), IAM, Glue/Data Catalog, Athena.
  • Production experience with Apache Iceberg (schema evolution, compaction, retention, metadata ops) and columnar formats (Parquet/Avro).
  • Hands-on depth in at least two of: Trino/Presto, Apache Spark/Structured Streaming, Apache Druid, Apache Flink.
  • Strong conceptual understanding of Kubernetes (EKS), including autoscaling, isolation, quotas, and observability.
  • Strong SQL skills and extensive experience with performance tuning, with solid proficiency in Python/Java.
  • Solid understanding of Kafka concepts; hands-on experience is a plus.
  • Experience running on-call for data platforms and driving measurable SLO-based improvements.
You might also have
  • Experience building feature platforms (feature definitions, materialization, serving, and online/offline consistency).
  • Airflow (or similar) at scale; Argo experience is a plus.
  • Familiarity with BigQuery (and ideally BigLake/Iceberg interop) and operational DBs like Aurora MySQL.
  • Experience with Clickhouse / Snowflake / Databricks / Starrocks.
  • FinOps background (cost attribution/showback, Spot strategies).
  • Data quality, lineage, and cataloging practices in large orgs.
  • IaC (Terraform/CloudFormation)
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners worldwide whose common language is English.


More jobs that may interest you

19.11.2025

Bezeqint DATA NETWORK Support Specialist Israel

Description:

Job description

Bezeq International TECH is recruiting a communications specialist - DATA & NETWORKING support.

The support specialist works on a team that provides technical service - activating international links and providing technical support for these services to the company's Israeli and global customers.

Because the work involves global customers, a high level of English is required: comprehension, technical English, and the ability to express yourself verbally and in writing.

The goal is to provide technical support in the data communications domain and to give these customers connectivity (healthy traffic flow) over dedicated communication lines that we have set up for them. Support is provided end to end: both activating and setting up these projects, and ongoing maintenance, troubleshooting (such as line outages, LATENCY, PACKET LOSS and more) and day-to-day service.

Customer service is provided remotely, and the role requires the ability to work with a variety of interfaces, internal and external, such as sales personnel within the division, project managers, and foreign operators abroad (such as BT, AT&T, Verizon and more).

The team's working hours are 8:00-19:00

Requirements

  • Familiarity and experience with networking, including WAN protocols (BGP) and the seven-layer (OSI) model - required
  • Familiarity with Cisco and Juniper equipment, at the configuration and service level - required
  • Strong command of English, written and spoken - required
  • Willingness to be on call (mainly via remote access) - required

More jobs that may interest you

19.11.2025

PayPal Sr Data Scientist Israel, Tel Aviv District, Tel Aviv-Yafo

Description:


2. This role is open to support the expansion of Bot Mitigation solutions and is part of our global team.
3. The role requires some flexibility in working hours and, due to the team's structure, a willingness to accommodate online meetings up to two evenings per week.

Essential Responsibilities:

  • Lead the development and implementation of advanced data science models.
  • Collaborate with stakeholders to understand requirements.
  • Drive best practices in data science.
  • Ensure data quality and integrity in all processes.
  • Mentor and guide junior data scientists.
  • Stay updated with the latest trends in data science.

Expected Qualifications:

  • 3+ years of relevant experience and a Bachelor’s degree, or any equivalent combination of education and experience.

In your role as a Decision Scientist, you will:

  • Track and measure performance against KPIs and goals to identify and mitigate fraud risk and enable growth
  • Plan, drive and execute projects from start to finish, with partners across the company, to develop cutting-edge, scalable and safe solutions
  • Work with product and platform teams to develop cutting-edge, scalable and safe products that enhance the experience for our global customers
  • Grow ongoing communication with partners across the company and share updates with senior leaders effectively, translating complex problems into simpler terms
  • Plan and measure experiments to analyse the incremental value of new intelligence feeds, from internal teams or vendors (a minimal measurement sketch follows this list)
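As a hedged illustration of the experiment-measurement work in the last bullet (the counts are made-up and the two-proportion z-test is a standard statistic, not PayPal's method), measuring the incremental value of a new intelligence feed might start with something like:

# Illustrative A/B measurement: two-proportion z-test on catch rates.
# All numbers are synthetic; this is generic statistics, not a company method.
from math import sqrt

from scipy.stats import norm


def two_proportion_ztest(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided
    return p_b - p_a, z, p_value


# Control vs. treatment with a new intelligence feed (hypothetical counts).
lift, z, p = two_proportion_ztest(success_a=480, n_a=10_000,
                                  success_b=540, n_b=10_000)
print(f"lift={lift:.4f}, z={z:.2f}, p={p:.4f}")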


Our Benefits:

Any general requests for consideration of your skills, please


A data scientist plays a crucial role in extracting valuable insights from large, complex datasets. Data Scientists are responsible for designing and running experiments, developing predictive models, and delivering actionable results that support data-driven decision making. They analyze and interpret large, complex datasets to uncover insights that enable business decisions and drive innovation. Data Scientists use a combination of statistical and machine learning techniques, along with programming languages such as Python and R, to extract, clean, and transform data, and they work with tools such as SQL, Hadoop, and Spark to handle large datasets and perform analysis.

Data scientists collaborate closely with diverse teams, such as product managers, marketing, and operations, to understand their specific data needs and provide insights that support informed decisions. They also work with data engineers to build and maintain the infrastructure required for data analysis, and they use visualization tools such as Tableau and Power BI to create interactive dashboards and reports that communicate data insights clearly to non-technical stakeholders.

A data scientist needs a strong technical background: experience with programming languages such as Python or R, experience with data visualization and reporting tools, and a deep understanding of statistical analysis and machine learning algorithms. Soft skills such as communication, collaboration, and problem solving are equally important, since data scientists often work cross-functionally with stakeholders from different departments to drive data-driven initiatives. Their contribution to any workplace is significant: they help organizations leverage data to make informed decisions, improve processes, and drive growth. A typical day includes collecting and cleaning data, designing experiments, building models, and presenting insights to stakeholders. In Israel, a data scientist's monthly salary typically ranges from 20,000 to 40,000 NIS. The role is ideal for people who are passionate about both technology and data and are motivated to uncover hidden insights that deliver business value. At Expoint we connect leading talent with leading companies and offer a wide range of data scientist positions, especially for experienced scientists looking to advance their careers. Browse the data scientist job listings and find your next position.
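To make the workflow described above concrete, here is a hedged, minimal Python sketch of cleaning a small dataset and fitting a predictive model; the data, column names, and model choice are synthetic examples, not a prescribed approach:

# Minimal illustration: clean a tiny synthetic dataset and fit a classifier.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic example data: two features and a binary target.
df = pd.DataFrame({
    "feature_a": [1.0, 2.1, 3.3, 4.0, 5.2, 6.1, 7.4, 8.0, None, 10.2],
    "feature_b": [0.5, 0.7, 1.1, 1.0, 1.6, 1.8, 2.2, 2.5, 2.9, 3.1],
    "target": [0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
})

# Basic cleaning: fill the missing value with the column mean.
df["feature_a"] = df["feature_a"].fillna(df["feature_a"].mean())

X_train, X_test, y_train, y_test = train_test_split(
    df[["feature_a", "feature_b"]], df["target"], test_size=0.3, random_state=42
)

model = LogisticRegression().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))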