Expoint – all jobs in one place
The point where experts and best companies meet

Duality-data Scientist jobs in Israel

Unlock your potential in the high tech industry with Expoint. Search for job opportunities as a Duality-data Scientist in Israel and join the network of leading companies. Start your journey today and find your dream job as a Duality-data Scientist with Expoint.
352 jobs found
Yesterday

Unity Senior DevOps Engineer Data Platform Israel, Tel-Aviv District, Tel-Aviv

Description:
The opportunity
  • Technical Leadership & Architecture: Drive data infrastructure strategy and establish standardized patterns for AI/ML workloads, with direct influence on architectural decisions across data and engineering teams
  • DataOps Excellence: Create seamless developer experience through self-service capabilities while significantly improving data engineer productivity and pipeline reliability metrics
  • Cross-Functional Innovation: Lead collaboration between DevOps, Data Engineering, and ML Operations teams to unify our approach to infrastructure as code and orchestration platforms
  • Technology Breadth & Growth: Work across the full DataOps spectrum from pipeline orchestration to AI/ML infrastructure, with clear advancement opportunities as a senior infrastructure engineer
  • Strategic Business Impact: Build scalable analytics capabilities that provide direct line of sight between your infrastructure work and business outcomes through reliable, cutting-edge data solutions
What you'll be doing
  • Design Data-Native Cloud Solutions - Design and implement scalable data infrastructure across multiple environments using Kubernetes, orchestration platforms, and IaC to power our AI, ML, and analytics ecosystem
  • Define DataOps Technical Strategy - Shape the technical vision and roadmap for our data infrastructure capabilities, aligning DevOps, Data Engineering, and ML teams around common patterns and practices
  • Accelerate Data Engineer Experience - Spearhead improvements to data pipeline deployment, monitoring tools, and self-service capabilities that empower data teams to deliver insights faster with higher reliability
  • Engineer Robust Data Platforms - Build and optimize infrastructure that supports diverse data workloads from real-time streaming to batch processing, ensuring performance and cost-effectiveness for critical analytics systems
  • Drive DataOps Excellence - Collaborate with engineering leaders across data teams, champion modern infrastructure practices, and mentor team members to elevate how we build, deploy, and operate data systems at scale
What we're looking for
  • 3+ years of hands-on DevOps experience building, shipping, and operating production systems.
  • Coding proficiency in at least one language (e.g., Python or TypeScript); able to build production-grade automation and tools.
  • Cloud platforms: deep experience with AWS, GCP, or Azure (core services, networking, IAM).
  • Kubernetes: strong end-to-end understanding of Kubernetes as a system (routing/networking, scaling, security, observability, upgrades), with proven experience integrating data-centric components (e.g., Kafka, RDS, BigQuery, Aerospike).
  • Infrastructure as Code: design and implement infrastructure automation using tools such as Terraform, Pulumi, or CloudFormation (modular code, reusable patterns, pipeline integration); a minimal Pulumi-style sketch follows this list.
  • GitOps & CI/CD: practical experience implementing pipelines and advanced delivery using tools such as Argo CD / Argo Rollouts, GitHub Actions, or similar.
  • Observability: metrics, logs, and traces; actionable alerting and SLOs using tools such as Prometheus, Grafana, ELK/EFK, OpenTelemetry, or similar.
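As a rough, hedged illustration of the Infrastructure-as-Code bullet above, the sketch below shows what modular, reusable infrastructure code can look like with Pulumi's Python SDK. The bucket name, tags, and environment names are hypothetical and are not taken from the posting.

```python
# Minimal Pulumi (Python) sketch of a reusable infrastructure module.
# Resource names, tags, and environments are hypothetical examples.
import pulumi
import pulumi_aws as aws


def data_lake_bucket(name: str, env: str) -> aws.s3.Bucket:
    """Create a versioned S3 bucket that follows a shared naming/tagging pattern."""
    return aws.s3.Bucket(
        f"{name}-{env}",
        versioning=aws.s3.BucketVersioningArgs(enabled=True),
        tags={"team": "data-platform", "env": env},
    )


# Reuse the same module for several environments from one program.
for env in ("staging", "prod"):
    bucket = data_lake_bucket("raw-events", env)
    pulumi.export(f"raw_events_bucket_{env}", bucket.id)
```

Packaged as a module like this, the same pattern can be wired into a CI/CD pipeline (`pulumi preview` on pull requests, `pulumi up` on merge), which is the kind of pipeline integration the bullet refers to.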
You might also have
  • Data Pipeline Orchestration - Demonstrated success building and optimizing data pipeline deployment using modern tools (Airflow, Prefect, Kubernetes operators) and implementing GitOps practices for data workloads; a toy Airflow example follows this list
  • Data Engineer Experience Focus - Track record of creating and improving self-service platforms, deployment tools, and monitoring solutions that measurably enhance data engineering team productivity
  • Data Infrastructure Deep Knowledge - Extensive experience designing infrastructure for data-intensive workloads including streaming platforms (Kafka, Kinesis), data processing frameworks (Spark, Flink), storage solutions, and comprehensive observability systems
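Purely as an illustration of the orchestration bullet above, here is a toy Airflow DAG for a daily extract-and-transform pipeline. The DAG id, task names, and schedule are invented for the example and assume a recent Airflow 2.x (2.4+) deployment.

```python
# Toy Airflow 2.x DAG illustrating pipeline orchestration; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("pull raw events from the source system")


def transform() -> None:
    print("clean and aggregate the extracted data")


with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
):
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds
```

In a GitOps setup, a DAG file like this would live in version control and be deployed to the scheduler by the same pipeline that deploys the rest of the data platform.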
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.

Yesterday

Forter Data Researcher Israel, Tel Aviv District, Tel Aviv-Yafo

Description:

What you'll be doing:

  • Invent, design, implement, and refine our system’s core decisioning logic and models in a live production environment.
  • Conduct in-depth research into complex fraud patterns, adversarial networks, and emerging global threats.
  • Leverage rich datasets to derive actionable insights, develop new system components, and advance our feature engineering processes.
  • Develop, prototype, and automate new tools and processes to enhance the precision and scale of our systems.
  • Collaborate with a world-class team of Data Scientists, Analysts, Researchers, and Engineers to develop the next generation of Forter’s AI technology.

What you'll need:

  • Relevant experience (one of the below)
    • At least 2 years of hands-on experience in quantitative research/data science, or a related role involving production-oriented data analytics and hypothesis-led research.
    • MSc or PhD in a quantitative field (e.g., Physics, Economics, Neuroscience, Biotechnology, Computer Science, etc.).
  • Strong analytical and logical reasoning skills with a proven ability to dissect and solve highly complex problems.
  • Extensive experience working with large datasets using scripting languages (e.g., Python, R, or Matlab); a short pandas sketch follows this list.
  • Excellent communication skills – ability to articulate complex technical concepts and research findings to diverse audiences.
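As a loose illustration of the large-dataset scripting requirement above, the snippet below sketches a hypothesis-led check in pandas. The file name, column names, and the decline-rate framing are invented for the example and are not taken from Forter's systems.

```python
# Hypothetical pandas sketch: compare decline rates across traffic segments.
# The file and column names are invented for illustration.
import pandas as pd

transactions = pd.read_parquet("transactions.parquet")

decline_rate = (
    transactions
    .assign(declined=transactions["decision"].eq("decline"))
    .groupby("traffic_segment")["declined"]
    .mean()
    .sort_values(ascending=False)
)

# Segments with unusually high decline rates are candidates for deeper research.
print(decline_rate.head(10))
```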
Bonus points for:
  • Experience with SQL and big data technologies (e.g., Spark).
  • Deep familiarity with machine learning concepts and practice
  • Risk / Intelligence experience

Trust is backed by data – Forter is a recipient of over 10 workplace and innovation awards, including:

  • Great Place to Work Certification (2021, 2022, 2023, )
  • Fortune’s Best Workplaces in NYC (2022, 2023 and )
  • Forbes Cloud 100 (2021, 2022, 2023 and )
  • #3 on Fast Company’s list of “Most Innovative Finance Companies” ( )
  • Anti-Fraud Solution of the Year at the Payments Awards ( )
  • SAP Pinnacle Awards “New Partner Application Award” (2023)
  • Fintech Breakthrough Awards – Best Fraud Prevention Platform (2023)

Yesterday

Unity Staff Data AI Engineer Israel, Tel-Aviv District, Tel-Aviv

Description:
SuperSonic is hiring a Staff Data AI Lead to lead our AI initiatives. As a Staff Data AI Lead, you will be responsible for leading SuperSonic's organization-wide AI integration efforts and serving as the bridge between advanced AI technologies and our business needs: implementing cutting-edge Data & AI technologies, creating AI-driven strategies, and integrating innovative AI solutions across multiple platforms.
What you'll be doing
  • Develop and execute AI strategies aligned with business objectives
  • Advise leadership on AI capabilities and potential applications
  • Guide teams in adopting AI tools and methodologies
  • Ensure ethical and efficient implementation of AI technologies
  • Design and oversee AI-driven process improvements
  • Collaborate with various departments to identify AI opportunities
  • Stay current with the latest AI trends and advancements
  • Conduct AI-related training and workshops for staff
  • Manage AI projects from conception to implementation
  • Evaluate and recommend AI tools and platforms
  • Lead a team of AI engineers
What we're looking for
  • Deep understanding of AI technologies, including large language models
  • Expertise in prompt engineering and AI-powered automation
  • Proficiency with AI tools such as ChatGPT, Claude, Midjourney, and Copilot
  • Knowledge of AI ethics and regulatory considerations
  • Strong problem-solving and analytical skills
  • Proficiency with Python or TypeScript for building AI workflows and data pipelines (a small illustrative sketch follows this list)
  • Excellent communication and leadership abilities
  • Ability to translate complex AI concepts for non-technical audiences
  • Experience in project management and cross-functional collaboration
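As a hedged, vendor-neutral sketch of what building AI workflows in Python can mean in practice, the snippet below shows a templated prompt plus a stubbed model call. The `call_model` function is a placeholder for whichever provider SDK is actually used (ChatGPT, Claude, etc.); it is not an API from the posting.

```python
# Illustrative prompt-engineering helper; `call_model` is a stand-in for a real
# LLM client and is purely hypothetical.
from string import Template

SUMMARY_PROMPT = Template(
    "You are an analyst. Summarize the following report in $n_bullets bullet "
    "points for a non-technical audience:\n\n$report"
)


def call_model(prompt: str) -> str:
    # Placeholder: swap in the SDK call for the provider you actually use.
    return f"[model output for a {len(prompt)}-character prompt]"


def summarize(report: str, n_bullets: int = 3) -> str:
    prompt = SUMMARY_PROMPT.substitute(report=report, n_bullets=n_bullets)
    return call_model(prompt)


if __name__ == "__main__":
    print(summarize("Q3 retention improved in two of five markets..."))
```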
You might also have
  • Advanced degree in Computer Science, AI, or related field
  • Previous experience in AI implementation within an organizational setting
  • Certifications in relevant AI technologies or platforms
  • Familiarity with no-code AI application development
  • Bachelor's degree in Computer Science, AI, or related field
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.


22.11.2025

Unity Data Platform Engineering Lead Israel, Tel-Aviv District, Tel-Aviv

Description:

Unify online/offline for features: Drive Flink adoption and patterns that keep features consistent and low-latency for experimentation and production.

Make self-serve real: Build golden paths, templates, and guardrails so product/analytics/DS engineers can move fast safely.

Run multi-tenant compute efficiently: EMR on EKS powered by Karpenter on Spot instances; right-size Trino/Spark/Druid for performance and cost.

Cross-cloud interoperability: BigQuery + BigLake/Iceberg interop where it makes sense (analytics, experimentation, partnership).

What you'll be doing
  • Leading a senior Data Platform team: setting clear objectives, unblocking execution, and raising the engineering bar.
  • Owning SLOs, on-call, incident response, and postmortems for core data services.
  • Designing and operating EMR on EKS capacity profiles, autoscaling policies, and multi-tenant isolation.
  • Tuning Trino (memory/spill, CBO, catalogs), Spark/Structured Streaming jobs, and Druid ingestion/compaction for sub-second analytics (a minimal streaming sketch follows this list).
  • Extending Flink patterns for the feature platform (state backends, checkpointing, watermarks, backfills).
  • Driving FinOps work: CUR-based attribution, S3 Inventory-driven retention/compaction, Reservations/Savings Plans strategy, OpenCost visibility.
  • Partnering with product engineering, analytics, and data science & ML engineers on roadmap, schema evolution, and data product SLAs.
  • Leveling up observability (Prometheus/VictoriaMetrics/Grafana), data quality checks, and platform self-service tooling.
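To ground the Spark/Structured Streaming bullet above, here is a minimal PySpark sketch that reads a Kafka topic and writes micro-batches to the console. The broker address, topic name, and checkpoint path are placeholders, and the job assumes the Spark Kafka connector package is on the classpath.

```python
# Minimal Spark Structured Streaming sketch; broker, topic, and checkpoint path
# are placeholders. Requires the spark-sql-kafka connector on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream.format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events-sketch")
    .start()
)
query.awaitTermination()
```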
What we're looking for
  • 2+ years leading engineers (team lead or manager) building/operating large-scale data platforms; 5+ years total in Data Infrastructure/DataOps roles.
  • Proven ownership of cloud-native data platforms on AWS: S3, EMR (preferably EMR on EKS), IAM, Glue/Data Catalog, Athena.
  • Production experience with Apache Iceberg (schema evolution, compaction, retention, metadata ops) and columnar formats (Parquet/Avro).
  • Hands-on depth in at least two of: Trino/Presto, Apache Spark/Structured Streaming, Apache Druid, Apache Flink.
  • Strong conceptual understanding of Kubernetes (EKS), including autoscaling, isolation, quotas, and observability
  • Strong SQL skills and extensive experience with performance tuning, with solid proficiency in Python/Java.
  • Solid understanding of Kafka concepts; hands-on experience is a plus
  • Experience running on-call for data platforms and driving measurable SLO-based improvements.
You might also have
  • Experience building feature platforms (feature definitions, materialization, serving, and online/offline consistency).
  • Airflow (or similar) at scale; Argo experience is a plus.
  • Familiarity with BigQuery (and ideally BigLake/Iceberg interop) and operational DBs like Aurora MySQL.
  • Experience with Clickhouse / Snowflake / Databricks / Starrocks.
  • FinOps background (cost attribution/showback, Spot strategies).
  • Data quality, lineage, and cataloging practices in large orgs.
  • IaC (Terraform/CloudFormation)
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.


19.11.2025

Bezeqint DATA NETWORK Support Israel

Description:

Job description

Bezeq International TECH is hiring a communications specialist: a DATA & NETWORKING support engineer.

The support engineer works on a team that provides technical service, activating international links and giving technical support for these services to the company's Israeli and global customers.

Because the work involves global customers, a high level of English is required: comprehension, technical English, and the ability to express yourself verbally and in writing.

The goal is to provide technical support in the data-communications domain and to give these customers connectivity (healthy traffic flow) over the dedicated communication lines we set up for them. Support is provided end to end, both at the level of activating and setting up these projects and at the level of maintenance, troubleshooting (such as line outages, latency, packet loss, and more), and ongoing service.

Service is delivered remotely, and the role requires working with a variety of internal and external interfaces, such as sales functions within the division, project managers, and foreign operators abroad (e.g., BT, AT&T, Verizon, and others).

The team operates between 8:00 and 19:00.

Requirements

  • Familiarity and experience with networking, including WAN protocols (BGP) and the seven-layer OSI model - required
  • Familiarity with Cisco and Juniper equipment, at the configuration and service level - required
  • Strong command of English, written and spoken - required
  • Willingness to be on call (mainly via remote access) - required

19.11.2025

PayPal Sr Data Scientist Israel, Tel Aviv District, Tel Aviv-Yafo

Description:


2. This role is open to support the expansion of our Bot Mitigation solutions and is part of our global team.
3. The role requires some flexibility in working hours; due to the team's structure, you will need to accommodate online meetings up to two evenings per week.

Essential Responsibilities:

  • Lead the development and implementation of advanced data science models.
  • Collaborate with stakeholders to understand requirements.
  • Drive best practices in data science.
  • Ensure data quality and integrity in all processes.
  • Mentor and guide junior data scientists.
  • Stay updated with the latest trends in data science.

Expected Qualifications:

  • 3+ years of relevant experience and a Bachelor's degree, or an equivalent combination of education and experience.

In your role as a Decision Scientist, you will:

  • Track and measure performance against KPIs and goals to identify and mitigate fraud risk and enable growth
  • Plan, drive, and execute projects from start to finish, with partners across the company, to develop cutting-edge, scalable, and safe solutions
  • Work with product and platform teams to develop cutting-edge, scalable, and safe products that enhance the experience for our global customers
  • Grow ongoing communication with partners across the company and share updates with senior leaders effectively, translating complex problems into simpler terms
  • Plan and measure experiments to analyse the incremental value of new intelligence feeds, from internal teams or vendors (a small worked example follows this list)
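As a rough illustration of the incremental-value bullet above, the snippet below runs a two-proportion z-test comparing catch rates with and without a hypothetical new intelligence feed. The counts are invented and the choice of test is an assumption made for the example, not PayPal's actual methodology.

```python
# Hedged example: two-proportion z-test for the lift from a hypothetical new
# intelligence feed. All counts below are invented for illustration.
from math import erfc, sqrt


def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal-approximation p-value
    return z, p_value


# Control: existing signals only; treatment: existing signals plus the new feed.
z, p = two_proportion_z(success_a=480, n_a=10_000, success_b=545, n_b=10_000)
print(f"z={z:.2f}, p={p:.4f}")  # a small p suggests the feed adds incremental value
```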




Are you ready to make an impact in the in-demand tech world? Are you a fast learner who is always looking for the next challenge? If so, consider a career in the booming tech industry as a duality-data scientist in Israel. Expoint is a job-search portal dedicated to the tech industry, offering exclusive access to this highly competitive field.

As a duality-data scientist in Israel, your primary duty is to build accurate, sophisticated, and intuitive machine learning models that identify patterns, provide analysis, and draw meaningful conclusions from large streams of data. You will also build predictive models and process natural-language data to classify and label documents, and you will explore and refine datasets, uncovering patterns, validating assumptions, and surfacing insights. You will work with massive volumes of structured and unstructured data from multiple sources, asking the right questions and finding solutions to specific problems. Your analytical skill set, combined with an innovative and proactive approach, will enable you to tell compelling data-driven stories with meaningful insights.

Above all, you will work in a stimulating environment where collaboration is key. Driven by curiosity, a strong sense of data-driven impact, and a "born to win" attitude, you will keep sharpening both your skill set and the product you help innovate. Do you have what it takes to exceed expectations in this highly dynamic industry? Take a shot at a career in duality-data science in Israel and become part of the driving force behind future advancements in this rewarding field. Start your journey with Expoint today!