Expoint – all jobs in one place
Finding the best job has never been easier

Experienced Big Data Developer jobs in Israel

Unlock your potential in the high-tech industry with Expoint. Search for job opportunities as an Experienced Big Data Developer in Israel and join a network of leading companies. Start your journey today and find your dream job as an Experienced Big Data Developer with Expoint.
590 jobs found
Yesterday

Unity Senior DevOps Engineer Data Platform Israel, Tel-Aviv District, Tel-Aviv

Limitless High-tech career opportunities - Expoint
Description:
The opportunity
  • Technical Leadership & Architecture: Drive data infrastructure strategy and establish standardized patterns for AI/ML workloads, with direct influence on architectural decisions across data and engineering teams
  • DataOps Excellence: Create seamless developer experience through self-service capabilities while significantly improving data engineer productivity and pipeline reliability metrics
  • Cross-Functional Innovation: Lead collaboration between DevOps, Data Engineering, and ML Operations teams to unify our approach to infrastructure as code and orchestration platforms
  • Technology Breadth & Growth: Work across the full DataOps spectrum from pipeline orchestration to AI/ML infrastructure, with clear advancement opportunities as a senior infrastructure engineer
  • Strategic Business Impact: Build scalable analytics capabilities that provide direct line of sight between your infrastructure work and business outcomes through reliable, cutting-edge data solutions
What you'll be doing
  • Design Data-Native Cloud Solutions - Design and implement scalable data infrastructure across multiple environments using Kubernetes, orchestration platforms, and IaC to power our AI, ML, and analytics ecosystem
  • Define DataOps Technical Strategy - Shape the technical vision and roadmap for our data infrastructure capabilities, aligning DevOps, Data Engineering, and ML teams around common patterns and practices
  • Accelerate Data Engineer Experience - Spearhead improvements to data pipeline deployment, monitoring tools, and self-service capabilities that empower data teams to deliver insights faster with higher reliability
  • Engineer Robust Data Platforms - Build and optimize infrastructure that supports diverse data workloads from real-time streaming to batch processing, ensuring performance and cost-effectiveness for critical analytics systems
  • Drive DataOps Excellence - Collaborate with engineering leaders across data teams, champion modern infrastructure practices, and mentor team members to elevate how we build, deploy, and operate data systems at scale
What we're looking for
  • 3+ years of hands-on DevOps experience building, shipping, and operating production systems.
  • Coding proficiency in at least one language (e.g., Python or TypeScript); able to build production-grade automation and tools.
  • Cloud platforms: deep experience with AWS, GCP, or Azure (core services, networking, IAM).
  • Kubernetes: strong end-to-end understanding of Kubernetes as a system (routing/networking, scaling, security, observability, upgrades), with proven experience integrating data-centric components (e.g., Kafka, RDS, BigQuery, Aerospike).
  • Infrastructure as Code: design and implement infrastructure automation using tools such as Terraform, Pulumi, or CloudFormation (modular code, reusable patterns, pipeline integration).
  • GitOps & CI/CD: practical experience implementing pipelines and advanced delivery using tools such as Argo CD / Argo Rollouts, GitHub Actions, or similar.
  • Observability: metrics, logs, and traces; actionable alerting and SLOs using tools such as Prometheus, Grafana, ELK/EFK, OpenTelemetry, or similar.
You might also have
  • Data Pipeline Orchestration - Demonstrated success building and optimizing data pipeline deployment using modern tools (Airflow, Prefect, Kubernetes operators) and implementing GitOps practices for data workloads
  • Data Engineer Experience Focus - Track record of creating and improving self-service platforms, deployment tools, and monitoring solutions that measurably enhance data engineering team productivity
  • Data Infrastructure Deep Knowledge - Extensive experience designing infrastructure for data-intensive workloads including streaming platforms (Kafka, Kinesis), data processing frameworks (Spark, Flink), storage solutions, and comprehensive observability systems
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written communication, since the role involves frequent, regular exchanges with colleagues and partners located worldwide whose common language is English.

Yesterday

Forter Data Researcher Israel, Tel Aviv District, Tel Aviv-Yafo

Description:

What you'll be doing:

  • Invent, design, implement, and refine our system’s core decisioning logic and models in a live production environment.
  • Conduct in-depth research into complex fraud patterns, adversarial networks, and emerging global threats.
  • Leverage rich datasets to derive actionable insights, develop new system components, and advance our feature engineering processes.
  • Develop, prototype, and automate new tools and processes to enhance the precision and scale of our systems.
  • Collaborate with a world-class team of Data Scientists, Analysts, Researchers, and Engineers to develop the next generation of Forter’s AI technology.

What you'll need:

  • Relevant experience (one of the below)
    • At least 2 years of hands-on experience in quantitative research/data science, or a related role involving production-oriented data analytics and hypothesis-led research.
    • MSc or PhD in a quantitative field (e.g., Physics, Economics, Neuroscience, Biotechnology, Computer Science, etc.).
  • Strong analytical and logical reasoning skills with a proven ability to dissect and solve highly complex problems.
  • Extensive experience working with large datasets using scripting languages (e.g., Python, R, or Matlab).
  • Excellent communication skills – ability to articulate complex technical concepts and research findings to diverse audiences.
Bonus points for:
  • Experience with SQL and big data technologies (e.g., Spark).
  • Deep familiarity with machine learning concepts and practice.
  • Risk / Intelligence experience.

Trust is backed by data – Forter is a recipient of over 10 workplace and innovation awards, including:

  • Great Place to Work Certification (2021, 2022, 2023)
  • Fortune’s Best Workplaces in NYC (2022, 2023)
  • Forbes Cloud 100 (2021, 2022, 2023)
  • #3 on Fast Company’s list of “Most Innovative Finance Companies”
  • Anti-Fraud Solution of the Year at the Payments Awards
  • SAP Pinnacle Awards “New Partner Application Award” (2023)
  • Fintech Breakthrough Awards – Best Fraud Prevention Platform (2023)

Yesterday

Rapyd Software Developer Israel

Description:
Description


Key Responsibilities:

  • Develop software systems to improve processes in the Finance Department, using Python and React.
  • Collaborate with finance staff and accountants to understand business needs and adapt solutions accordingly.
  • Provide quick and effective solutions to technological problems, while maintaining code quality and system reliability.
  • Take part in the full development lifecycle, including planning, development, integration, and maintenance.
  • Meet deadlines and execute plans according to the department’s changing needs.
  • Work independently and proactively, taking responsibility for assigned tasks.
  • Communicate progress and updates to relevant stakeholders.
Requirements
  • Experience or familiarity with backend development in Python.
  • Experience or familiarity with frontend development in React.
  • Experience with ETL tools.
  • Willingness and ability to quickly learn new technologies and tools as required.
  • Ability to work independently and responsibly.
  • Strong analytical and problem-solving skills.
  • Good interpersonal and teamwork skills.
  • High level of English.

Advantages:

  • Experience in a finance department or financial sector.
  • Experience with AWS technologies or ERP systems.
  • Initiative and willingness to learn quickly.
  • Creative problem-solving skills.
  • Commitment to high-quality work and meeting deadlines.

Yesterday

Rapyd Senior Backend Developer Israel

Description:
Description


Working with cutting-edge technologies like Python, Node.js, and AWS, you will build innovative, low-latency, and highly available systems that form the core of our business. In this role, you will be instrumental in shaping our technical architecture and driving excellence in a fast-paced, evolving environment.

Key Responsibilities:

  • Architecture & Development: Lead the design and development of robust, scalable, and secure backend systems and APIs, primarily using Python.
  • Cloud Infrastructure: Architect, build, and maintain high-scale solutions on modern cloud platforms (specifically AWS) to ensure reliability, scalability, and performance.
  • System Performance: Design and implement services optimized for low-latency processing, high availability, and fault tolerance for mission-critical financial applications.
  • Technical Leadership: Provide technical leadership and mentorship to junior and mid-level developers, fostering best practices in code quality, testing, and maintainability.
  • Product Focus: Develop and enhance Compliance and Risk management products for the Fintech sector, ensuring they meet strict regulatory requirements and industry standards.
  • Optimization & Reliability: Take ownership of system performance, scalability, and reliability. Proactively identify and resolve bottlenecks in code and architecture.
  • Collaboration: Work closely with cross-functional teams, including frontend developers, DevOps engineers, and product managers, to deliver seamless end-to-end solutions.
  • Innovation: Stay current with emerging technologies and industry trends, driving the adoption of new tools and frameworks to solve complex challenges effectively.
Requirements
  • Experience: 5-8 years of professional backend development experience.
  • Python Proficiency: Expert-level knowledge of Python and its ecosystem.
  • Frameworks: Strong experience with web frameworks such as Django, Flask, Sanic, or FastAPI.
  • Cloud Computing: Proven experience designing, deploying, and managing applications on AWS (e.g., EC2, S3, Lambda, RDS, ECS/EKS).
  • Databases: Proficiency with both relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, DynamoDB) databases.
  • Architecture: Solid understanding of microservices architecture, RESTful APIs, and event-driven systems.
  • Mentorship: Demonstrated ability to mentor and guide other engineers.
  • Problem Solving: Excellent analytical and problem-solving skills with a strong sense of ownership.

Preferred Qualifications (Nice to Have)

  • Experience with Node.js.
  • Previous experience in the Fintech, RegTech (Regulatory Technology), or financial services industry.
  • Hands-on experience with containerization technologies like Docker and Kubernetes.

Yesterday

Unity Staff Data AI Engineer Israel, Tel-Aviv District, Tel-Aviv

Description:
SuperSonic is hiring a Staff Data AI Lead to lead our AI initiatives. As a Staff Data AI Lead, you will be responsible for leading SuperSonic's organization-wide AI integration efforts and serving as the bridge between advanced AI technologies and our business needs: implementing cutting-edge Data & AI technologies, creating AI-driven strategies, and integrating innovative AI solutions across multiple platforms.
What you'll be doing
  • Develop and execute AI strategies aligned with business objectives
  • Advise leadership on AI capabilities and potential applications
  • Guide teams in adopting AI tools and methodologies
  • Ensure ethical and efficient implementation of AI technologies
  • Design and oversee AI-driven process improvements
  • Collaborate with various departments to identify AI opportunities
  • Stay current with the latest AI trends and advancements
  • Conduct AI-related training and workshops for staff
  • Manage AI projects from conception to implementation
  • Evaluate and recommend AI tools and platforms
  • Lead a team of AI engineers
What we're looking for
  • Deep understanding of AI technologies, including large language models
  • Expertise in prompt engineering and AI-powered automation
  • Proficiency with AI tools such as ChatGPT, Claude, Midjourney, and Copilot
  • Knowledge of AI ethics and regulatory considerations
  • Strong problem-solving and analytical skills
  • Proficiency with Python or TypeScript for building AI workflows and data pipelines
  • Excellent communication and leadership abilities
  • Ability to translate complex AI concepts for non-technical audiences
  • Experience in project management and cross-functional collaboration
You might also have
  • Advanced degree in Computer Science, AI, or related field
  • Previous experience in AI implementation within an organizational setting
  • Certifications in relevant AI technologies or platforms
  • Familiarity with no-code AI application development
  • Bachelor's degree in Computer Science, AI, or related field
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written communication, since the role involves frequent, regular exchanges with colleagues and partners located worldwide whose common language is English.


22.11.2025

Unity Senior Backend Developer Israel, Tel-Aviv District, Tel-Aviv

Description:
The opportunity

Why You'll Love This Role:

  • Take ownership of major projects and core features with significant business impact.
  • Work alongside skilled engineers in a fast-paced, high-scale environment.
  • Tackle challenging technical problems using cutting-edge tools like Kafka and Aerospike.
What you'll be doing
  • Design, develop, and monitor new features for the platform, including components supporting targeting, attribution, and budget management.
  • Maintain and optimize the performance of critical backend systems.
  • Refactor and improve legacy code to ensure scalability and efficiency.
  • Lead efforts to guarantee system reliability and low latency in a high-concurrency environment.
  • Collaborate with peers and across teams to deliver high-quality, production-ready solutions.
What we're looking for
  • Proficiency in backend development with Scala (highly preferred) or Java.
  • Strong understanding of distributed systems and real-time, high-concurrency applications.
  • Experience with Kafka and other streaming/messaging systems.
  • Extensive experience debugging and optimizing high-scale production environments.
  • Proven ability to design and build backend systems focused on performance and scalability.
You might also have
  • Hands-on experience with Aerospike or other NoSQL databases.
  • Familiarity with Kubernetes or container orchestration tools.
  • Knowledge of functional programming paradigms.
If you're looking for a role where you can contribute meaningfully, tackle exciting challenges, and work with a friendly, professional team that values collaboration and expertise, we’d love to hear from you!
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written communication, since the role involves frequent, regular exchanges with colleagues and partners located worldwide whose common language is English.


22.11.2025

Unity Data Platform Engineering Lead Israel, Tel-Aviv District, Tel-Aviv

Description:

Unify online/offline for features: Drive Flink adoption and patterns that keep features consistent and low-latency for experimentation and production.

Make self-serve real: Build golden paths, templates, and guardrails so product/analytics/DS engineers can move fast safely.

Run multi-tenant compute efficiently: EMR on EKS powered by Karpenter on Spot instances; right-size Trino/Spark/Druid for performance and cost.

Cross-cloud interoperability: BigQuery + BigLake/Iceberg interop where it makes sense (analytics, experimentation, partnership).

What you'll be doing
  • Leading a senior Data Platform team: setting clear objectives, unblocking execution, and raising the engineering bar.
  • Owning SLOs, on-call, incident response, and postmortems for core data services.
  • Designing and operating EMR on EKS capacity profiles, autoscaling policies, and multi-tenant isolation.
  • Tuning Trino (memory/spill, CBO, catalogs), Spark/Structured Streaming jobs, and Druid ingestion/compaction for sub-second analytics.
  • Extending Flink patterns for the feature platform (state backends, checkpointing, watermarks, backfills).
  • Driving FinOps work: CUR-based attribution, S3 Inventory-driven retention/compaction, Reservations/Savings Plans strategy, OpenCost visibility.
  • Partnering with product engineering, analytics, and data science & ML engineers on roadmap, schema evolution, and data product SLAs.
  • Leveling up observability (Prometheus/VictoriaMetrics/Grafana), data quality checks, and platform self-service tooling.
What we're looking for
  • 2+ years leading engineers (team lead or manager) building/operating large-scale data platforms; 5+ years total in Data Infrastructure/DataOps roles.
  • Proven ownership of cloud-native data platforms on AWS: S3, EMR (preferably EMR on EKS), IAM, Glue/Data Catalog, Athena.
  • Production experience with Apache Iceberg (schema evolution, compaction, retention, metadata ops) and columnar formats (Parquet/Avro).
  • Hands-on depth in at least two of: Trino/Presto, Apache Spark/Structured Streaming, Apache Druid, Apache Flink.
  • Strong conceptual understanding of Kubernetes (EKS), including autoscaling, isolation, quotas, and observability.
  • Strong SQL skills and extensive experience with performance tuning, with solid proficiency in Python/Java.
  • Solid understanding of Kafka concepts; hands-on experience is a plus.
  • Experience running on-call for data platforms and driving measurable SLO-based improvements.
You might also have
  • Experience building feature platforms (feature definitions, materialization, serving, and online/offline consistency).
  • Airflow (or similar) at scale; Argo experience is a plus.
  • Familiarity with BigQuery (and ideally BigLake/Iceberg interop) and operational DBs like Aurora MySQL.
  • Experience with Clickhouse / Snowflake / Databricks / Starrocks.
  • FinOps background (cost attribution/showback, Spot strategies).
  • Data quality, lineage, and cataloging practices in large orgs.
  • IaC (Terraform/CloudFormation).
Additional information
  • Relocation support is not available for this position.
  • Work visa/immigration sponsorship is not available for this position.

This position requires sufficient knowledge of English for professional verbal and written communication, since the role involves frequent, regular exchanges with colleagues and partners located worldwide whose common language is English.

Do you want to take your tech skills to the next level? If you’re an experienced big data developer seeking an opportunity to work in Israel, look no further than Expoint. At Expoint, we give experienced big data developers access to a wide range of cutting-edge projects from top employers throughout Israel. Our platform is tailored to help you transition into a big data development role, letting you leverage the skills you already have to solve complex challenges.

With a big data development role from Expoint, you get access to the most advanced technologies at top-tier Israeli companies. Work with frameworks such as Hadoop and Spark to build powerful solutions, and tap into Expoint’s network of experienced developers to stay current in the ever-evolving field of big data.

Whether you’re a junior developer looking to expand your skill set or a seasoned professional seeking a new challenge, Expoint is the perfect platform. We offer a robust salary package, flexible schedules, access to innovative projects, and the chance to build your digital profile through your professional network.

Make a name for yourself as an experienced big data developer in Israel. Expoint is your one-stop shop for finding the perfect big data development role. Sign up today and take the first step toward your tech career in Israel. With Expoint, you’ll have everything you need to succeed!