Expoint – all jobs in one place
Finding the best job has never been easier

Applied Data Scientist jobs in Israel

Unlock your potential in the high tech industry with Expoint. Search for job opportunities as an Applied Data Scientist in Israel and join the network of leading companies. Start your journey today and find your dream job as an Applied Data Scientist with Expoint.
352 jobs found
Yesterday

Dell Data Center Sales Executive Israel, Tel-Aviv District

08.12.2025

Palo Alto Senior Manager Software Engineering - Data Security Cortex C... Israel, Tel Aviv District, Tel Aviv-Yafo

Description:

Being the cybersecurity partner of choice, protecting our digital way of life.

Your Impact

  • Lead, mentor, and grow multiple engineering teams and managers within the DSPM domain.
  • Own the full SDLC - from translating product strategy into actionable roadmaps to ensuring flawless execution and on-time delivery.
  • Engage directly with strategic customers to lead technical deep-dives, architecture reviews, and roadmap discussions.
  • Set high engineering standards for quality and security.
  • Build a culture of accountability, ownership, and continuous improvement.
  • Oversee the architecture of scalable, distributed systems (primarily Python & Go) capable of processing data at petabyte scale.
  • Guide teams on building high-throughput pipelines and cloud-native microservices (a minimal sketch follows this list).
  • Ensure efficient deployment, observability, and runtime stability in production environments.
  • Partner closely with Product Managers and cross-functional groups (Infra, Research, UX) to define priorities and build multi-quarter roadmaps.
  • Align stakeholders across business units and communicate tradeoffs, risks, and execution plans with clarity.
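
To make the pipeline-oriented bullets above concrete, here is a minimal, hypothetical Python sketch of a high-throughput ingestion worker that batches records before flushing them downstream. The queue source, batch size, and flush target are illustrative assumptions, not details of the actual DSPM/Cortex codebase.

    # Hypothetical sketch: an async worker that drains a queue in batches to
    # keep downstream writes high-throughput. Names and sizes are illustrative.
    import asyncio

    BATCH_SIZE = 500
    FLUSH_INTERVAL = 1.0  # seconds

    async def flush(batch):
        # Placeholder for a real sink (object store, message bus, database).
        print(f"wrote {len(batch)} records")

    async def consumer(queue: asyncio.Queue):
        batch = []
        while True:
            try:
                item = await asyncio.wait_for(queue.get(), timeout=FLUSH_INTERVAL)
                batch.append(item)
            except asyncio.TimeoutError:
                pass  # flush whatever accumulated during a quiet period
            if batch and (len(batch) >= BATCH_SIZE or queue.empty()):
                await flush(batch)
                batch = []

    async def main():
        queue: asyncio.Queue = asyncio.Queue()
        worker = asyncio.create_task(consumer(queue))
        for i in range(1200):          # simulated incoming records
            await queue.put({"id": i})
        await asyncio.sleep(2)         # allow the worker to drain
        worker.cancel()

    asyncio.run(main())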

Your Experience

  • 3+ years managing software engineering teams, including managing managers.
  • 5+ years of experience as a hands-on software engineer.
  • Proven track record of delivering complex, distributed cloud products end-to-end.
  • Strong systems-level background in one or more languages: Go, Python.
  • Experience with large-scale cloud architectures (GCP, AWS, or Azure).
  • Demonstrated ability to plan, execute, and deliver roadmaps with high predictability.
  • Strong collaboration skills; able to align cross-disciplinary teams around a shared goal.

Advantages

  • Experience with orchestration frameworks (Temporal, Argo Workflows, etc.).
  • Familiarity with BigQuery, MongoDB, PostgreSQL, or similar.
  • Background in cybersecurity, data security, or threat intelligence.
  • Experience running services at massive scale across distributed environments.

All your information will be kept confidential according to EEO guidelines.


23.11.2025

Unity Senior DevOps Engineer Data Platform Israel, Tel-Aviv District, Tel-Aviv

Description:
The opportunity
  • Technical Leadership & Architecture: Drive data infrastructure strategy and establish standardized patterns for AI/ML workloads, with direct influence on architectural decisions across data and engineering teams
  • DataOps Excellence: Create seamless developer experience through self-service capabilities while significantly improving data engineer productivity and pipeline reliability metrics
  • Cross-Functional Innovation: Lead collaboration between DevOps, Data Engineering, and ML Operations teams to unify our approach to infrastructure as code and orchestration platforms
  • Technology Breadth & Growth: Work across the full DataOps spectrum from pipeline orchestration to AI/ML infrastructure, with clear advancement opportunities as a senior infrastructure engineer
  • Strategic Business Impact: Build scalable analytics capabilities that provide direct line of sight between your infrastructure work and business outcomes through reliable, cutting-edge data solutions
What you'll be doing
  • Design Data-Native Cloud Solutions - Design and implement scalable data infrastructure across multiple environments using Kubernetes, orchestration platforms, and IaC to power our AI, ML, and analytics ecosystem
  • Define DataOps Technical Strategy - Shape the technical vision and roadmap for our data infrastructure capabilities, aligning DevOps, Data Engineering, and ML teams around common patterns and practices
  • Accelerate Data Engineer Experience - Spearhead improvements to data pipeline deployment, monitoring tools, and self-service capabilities that empower data teams to deliver insights faster with higher reliability
  • Engineer Robust Data Platforms - Build and optimize infrastructure that supports diverse data workloads from real-time streaming to batch processing, ensuring performance and cost-effectiveness for critical analytics systems
  • Drive DataOps Excellence - Collaborate with engineering leaders across data teams, champion modern infrastructure practices, and mentor team members to elevate how we build, deploy, and operate data systems at scale
What we're looking for
  • 3+ years of hands-on DevOps experience building, shipping, and operating production systems.
  • Coding proficiency in at least one language (e.g., Python or TypeScript); able to build production-grade automation and tools.
  • Cloud platforms: deep experience with AWS, GCP, or Azure (core services, networking, IAM).
  • Kubernetes: strong end-to-end understanding of Kubernetes as a system (routing/networking, scaling, security, observability, upgrades), with proven experience integrating data-centric components (e.g., Kafka, RDS, BigQuery, Aerospike).
  • Infrastructure as Code: design and implement infrastructure automation using tools such as Terraform, Pulumi, or CloudFormation (modular code, reusable patterns, pipeline integration); a minimal sketch follows this list.
  • GitOps & CI/CD: practical experience implementing pipelines and advanced delivery using tools such as Argo CD / Argo Rollouts, GitHub Actions, or similar.
  • Observability: metrics, logs, and traces; actionable alerting and SLOs using tools such as Prometheus, Grafana, ELK/EFK, OpenTelemetry, or similar.
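
To illustrate the infrastructure-as-code bullet above, here is a minimal sketch of what a Python-based IaC definition might look like, assuming Pulumi with the AWS provider; the resource names, tags, and exported output are illustrative assumptions rather than the team's actual stack.

    # Hypothetical Pulumi (Python) sketch: a small, reusable piece of data
    # infrastructure defined as code. Resource names and tags are illustrative.
    import pulumi
    import pulumi_aws as aws

    # A bucket for pipeline artifacts, tagged for cost attribution.
    artifacts = aws.s3.Bucket(
        "data-pipeline-artifacts",
        tags={"team": "data-platform", "env": pulumi.get_stack()},
    )

    # Expose the bucket name so CI/CD pipelines and other stacks can consume it.
    pulumi.export("artifacts_bucket", artifacts.bucket)
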
You might also have
  • Data Pipeline Orchestration - Demonstrated success building and optimizing data pipeline deployment using modern tools (Airflow, Prefect, Kubernetes operators) and implementing GitOps practices for data workloads; a minimal DAG sketch follows this list
  • Data Engineer Experience Focus - Track record of creating and improving self-service platforms, deployment tools, and monitoring solutions that measurably enhance data engineering team productivity
  • Data Infrastructure Deep Knowledge - Extensive experience designing infrastructure for data-intensive workloads including streaming platforms (Kafka, Kinesis), data processing frameworks (Spark, Flink), storage solutions, and comprehensive observability systems
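
To ground the pipeline-orchestration bullet above, here is a minimal sketch of a daily Airflow DAG, assuming Airflow 2.x; the DAG id, task names, and callables are hypothetical.

    # Hypothetical sketch of a minimal Airflow 2.x DAG: one daily extract task
    # feeding one transform task. Task names and logic are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**_):
        print("pull raw data from the source system")

    def transform(**_):
        print("clean and load the data into the warehouse")

    with DAG(
        dag_id="example_daily_ingest",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",   # Airflow >= 2.4; use schedule_interval on older versions
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)

        extract_task >> transform_task
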
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.


23.11.2025

Forter Data Researcher Israel, Tel Aviv District, Tel Aviv-Yafo

Description:

What you'll be doing:

  • Invent, design, implement, and refine our system’s core decisioning logic and models in a live production environment.
  • Conduct in-depth research into complex fraud patterns, adversarial networks, and emerging global threats.
  • Leverage rich datasets to derive actionable insights, develop new system components, and advance our feature engineering processes.
  • Develop, prototype, and automate new tools and processes to enhance the precision and scale of our systems.
  • Collaborate with a world-class team of Data Scientists, Analysts, Researchers, and Engineers to develop the next generation of Forter’s AI technology.

What you'll need:

  • Relevant experience (one of the below)
    • At least 2 years of hands-on experience in quantitative research/data science, or a related role involving production-oriented data analytics and hypothesis-led research.
    • MSc or PhD in a quantitative field (e.g., Physics, Economics, Neuroscience, Biotechnology, Computer Science, etc.).
  • Strong analytical and logical reasoning skills with a proven ability to dissect and solve highly complex problems.
  • Extensive experience working with large datasets using scripting languages (e.g., Python, R, or Matlab).
  • Excellent communication skills – ability to articulate complex technical concepts and research findings to diverse audiences.
Bonus points for:
  • Experience with SQL and big data technologies (e.g., Spark); a minimal PySpark sketch follows this list.
  • Deep familiarity with machine learning concepts and practice.
  • Risk/intelligence experience.
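
Since the bonus list mentions SQL and Spark, here is a minimal PySpark sketch of the kind of hypothesis-led aggregation the role describes; the input path, column names, and threshold are hypothetical.

    # Hypothetical PySpark sketch: aggregate a large transactions dataset to
    # surface segments with unusually high decline rates. Path/columns assumed.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("decline-rate-by-segment").getOrCreate()

    tx = spark.read.parquet("s3://example-bucket/transactions/")  # assumed path

    decline_rates = (
        tx.groupBy("merchant_segment")
          .agg(
              F.count("*").alias("n_tx"),
              F.avg(F.col("declined").cast("double")).alias("decline_rate"),
          )
          .filter(F.col("n_tx") > 1000)          # ignore thin segments
          .orderBy(F.col("decline_rate").desc())
    )

    decline_rates.show(20, truncate=False)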

Trust is backed by data – Forter is a recipient of over 10 workplace and innovation awards, including:

  • Great Place to Work Certification (2021, 2022, 2023)
  • Fortune’s Best Workplaces in NYC (2022, 2023)
  • Forbes Cloud 100 (2021, 2022, 2023)
  • #3 on Fast Company’s list of “Most Innovative Finance Companies”
  • Anti-Fraud Solution of the Year at the Payments Awards
  • SAP Pinnacle Awards “New Partner Application Award” (2023)
  • Fintech Breakthrough Awards – Best Fraud Prevention Platform (2023)

23.11.2025

Unity Staff Data AI Engineer Israel, Tel-Aviv District, Tel-Aviv

Description:
SuperSonic is hiring a Staff Data AI Lead to lead our AI initiatives. As a Staff Data AI Lead, you will be responsible for leading SuperSonic's organization-wide AI integration efforts and serving as the bridge between advanced AI technologies and our business needs: implementing cutting-edge Data & AI technologies, creating AI-driven strategies, and integrating innovative AI solutions across multiple platforms.
What you'll be doing
  • Develop and execute AI strategies aligned with business objectives
  • Advise leadership on AI capabilities and potential applications
  • Guide teams in adopting AI tools and methodologies
  • Ensure ethical and efficient implementation of AI technologies
  • Design and oversee AI-driven process improvements
  • Collaborate with various departments to identify AI opportunities
  • Stay current with the latest AI trends and advancements
  • Conduct AI-related training and workshops for staff
  • Manage AI projects from conception to implementation
  • Evaluate and recommend AI tools and platforms
  • Lead a team of AI engineers
What we're looking for
  • Deep understanding of AI technologies, including large language models
  • Expertise in prompt engineering and AI-powered automation
  • Proficiency with AI tools such as ChatGPT, Claude, Midjourney, and Copilot
  • Knowledge of AI ethics and regulatory considerations
  • Strong problem-solving and analytical skills
  • Proficiency with Python or TypeScript for building AI workflows and data pipelines (a minimal sketch follows this list)
  • Excellent communication and leadership abilities
  • Ability to translate complex AI concepts for non-technical audiences
  • Experience in project management and cross-functional collaboration
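
To illustrate the prompt-engineering and Python-workflow bullets above, here is a deliberately vendor-neutral sketch; the prompt template and the call_llm stub are hypothetical placeholders rather than any specific provider's API.

    # Hypothetical sketch: a tiny prompt-engineering helper that fills a template
    # and would hand the prompt to an LLM client. The client itself is stubbed.
    from string import Template

    SUMMARY_PROMPT = Template(
        "You are an analyst. Summarize the following report for a non-technical "
        "audience in at most $max_words words:\n\n$report"
    )

    def build_prompt(report: str, max_words: int = 120) -> str:
        return SUMMARY_PROMPT.substitute(report=report, max_words=max_words)

    def call_llm(prompt: str) -> str:
        # Placeholder: swap in a real client (an internal gateway or a vendor
        # SDK) here. Kept as a stub so the sketch stays vendor-neutral.
        return f"[model response for a {len(prompt)}-character prompt]"

    if __name__ == "__main__":
        prompt = build_prompt("Q3 ad revenue grew 12% while fill rate dipped 3%.")
        print(call_llm(prompt))
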
You might also have
  • Advanced degree in Computer Science, AI, or related field
  • Previous experience in AI implementation within an organizational setting
  • Certifications in relevant AI technologies or platforms
  • Familiarity with no-code AI application development
  • Bachelor's degree in Computer Science, AI, or related field
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.


22.11.2025

Unity Data Platform Engineering Lead Israel, Tel-Aviv District, Tel-Aviv

Description:

  • Unify online/offline for features: Drive Flink adoption and patterns that keep features consistent and low-latency for experimentation and production.
  • Make self-serve real: Build golden paths, templates, and guardrails so product/analytics/DS engineers can move fast safely.
  • Run multi-tenant compute efficiently: EMR on EKS powered by Karpenter on Spot instances; right-size Trino/Spark/Druid for performance and cost.
  • Cross-cloud interoperability: BigQuery + BigLake/Iceberg interop where it makes sense (analytics, experimentation, partnership).

What you'll be doing
  • Leading a senior Data Platform team: setting clear objectives, unblocking execution, and raising the engineering bar.
  • Owning SLOs, on-call, incident response, and postmortems for core data services.
  • Designing and operating EMR on EKS capacity profiles, autoscaling policies, and multi-tenant isolation.
  • Tuning Trino (memory/spill, CBO, catalogs), Spark/Structured Streaming jobs, and Druid ingestion/compaction for sub-second analytics (a minimal streaming sketch follows this list).
  • Extending Flink patterns for the feature platform (state backends, checkpointing, watermarks, backfills).
  • Driving FinOps work: CUR-based attribution, S3 Inventory-driven retention/compaction, Reservations/Savings Plans strategy, OpenCost visibility.
  • Partnering with product engineering, analytics, and data science & ML engineers on roadmap, schema evolution, and data product SLAs.
  • Leveling up observability (Prometheus/VictoriaMetrics/Grafana), data quality checks, and platform self-service tooling.
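
As a reference point for the streaming bullets above, here is a minimal PySpark Structured Streaming sketch with an event-time watermark and checkpointing; the Kafka brokers, topic, schema, and checkpoint path are illustrative assumptions.

    # Hypothetical sketch: a Structured Streaming job that reads events from
    # Kafka, applies an event-time watermark, and writes windowed counts with
    # checkpointing enabled. Topic, schema, and paths are illustrative.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("windowed-event-counts").getOrCreate()

    schema = StructType([
        StructField("event_type", StringType()),
        StructField("event_time", TimestampType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")    # assumed brokers
        .option("subscribe", "feature-events")                # assumed topic
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    counts = (
        events.withWatermark("event_time", "10 minutes")
        .groupBy(F.window("event_time", "5 minutes"), "event_type")
        .count()
    )

    query = (
        counts.writeStream.outputMode("update")
        .format("console")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/feature-events/")
        .start()
    )
    query.awaitTermination()
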
What we're looking for
  • 2+ years leading engineers (team lead or manager) building/operating large-scale data platforms; 5+ years total in Data Infrastructure/DataOps roles.
  • Proven ownership of cloud-native data platforms on AWS: S3, EMR (preferably EMR on EKS), IAM, Glue/Data Catalog, Athena.
  • Production experience with Apache Iceberg (schema evolution, compaction, retention, metadata ops) and columnar formats (Parquet/Avro).
  • Hands-on depth in at least two of: Trino/Presto, Apache Spark/Structured Streaming, Apache Druid, Apache Flink.
  • Strong conceptual understanding of Kubernetes (EKS), including autoscaling, isolation, quotas, and observability.
  • Strong SQL skills and extensive experience with performance tuning, with solid proficiency in Python/Java.
  • Solid understanding of Kafka concepts; hands-on experience is a plus.
  • Experience running on-call for data platforms and driving measurable SLO-based improvements.
You might also have
  • Experience building feature platforms (feature definitions, materialization, serving, and online/offline consistency).
  • Airflow (or similar) at scale; Argo experience is a plus.
  • Familiarity with BigQuery (and ideally BigLake/Iceberg interop) and operational DBs like Aurora MySQL.
  • Experience with Clickhouse / Snowflake / Databricks / Starrocks.
  • FinOps background (cost attribution/showback, Spot strategies).
  • Data quality, lineage, and cataloging practices in large orgs.
  • IaC (Terraform/CloudFormation).
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written exchanges, since the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.


Dell Data Center Sales Executive Israel, Tel-Aviv District
Description:

As a Data Center Sales Executive, you will:
  • Build and lead relationships for highly sophisticated customer accounts.
  • Conduct customer needs analysis and anticipate requirements beyond the existing solution’s scope.
  • Prepare detailed product specifications to enable the sale of our products and solutions, and deliver impact presentations at customer facilities.
  • Verify operability of sophisticated product and service configurations within the customer’s environment.
  • Perform advanced systems integration and provide technical expertise to design and implement the solution.

Requirements:
  • Excellent communication, relationship, and leadership skills in the industry.
  • Ability to present to executive level and articulate Dell Technologies solutions.
  • In-depth understanding of the market, technologies, products, and services.
  • 8 to 12 years of related experience in a relationship selling role.
  • Advanced experience in a relationship selling role.
  • Bachelor’s degree.


Application closing date: Dec. 20th, 2025

The Applied Data Scientist role in Israel requires a multi-faceted skillset. It is a rewarding and exciting area of work that encompasses the analysis, processing, and implementation of data solutions to support the development of real-world business applications. An Applied Data Scientist in Israel must possess a deep understanding of algorithm design, advanced mathematics, machine learning, and deep analytics, among other areas. The ability to think critically and solve problems creatively is a must-have, and familiarity with data-oriented technologies and techniques, such as Natural Language Processing (NLP) and large-scale data processing, is highly desirable.

At Expoint, we are looking for a motivated and experienced Applied Data Scientist in the Israel area. The core responsibilities of this role lie in the construction, analysis, and implementation of data solutions to solve complex business problems. This involves harnessing powerful algorithms and applying advanced analytics to glean insights and build machine learning models. The role also requires the Data Scientist to communicate complex findings to non-technical stakeholders in a clear and concise manner.

Candidates for the Applied Data Scientist role in Israel must have experience programming in Python/R/Java, leading the implementation of Big Data and ML-as-a-Service tools, a practical understanding of probability and statistics, and a developed understanding of mathematics and algorithms. We are looking for an Applied Data Scientist in Israel who can innovate data solutions and reporting capabilities with business stakeholders to support short- and longer-term objectives at Expoint. This is an exciting and challenging opportunity where every day presents a new challenge.
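
To make the day-to-day work described above concrete, here is a minimal, hypothetical Python sketch of training and evaluating a simple model; the input file, column names, and model choice are assumptions for illustration only.

    # Hypothetical sketch: training a simple model to score business data.
    # Column names, file path, and model choice are illustrative assumptions.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("transactions.csv")           # assumed input file
    X = df.drop(columns=["label"])                  # assumed feature columns
    y = df["label"]                                 # assumed binary target

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y
    )

    model = GradientBoostingClassifier().fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"Hold-out AUC: {auc:.3f}")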