

Design and deliver distributed systems supporting ingestion, streaming, storage, and governance for eBay’s Data Platform.
Develop services and APIs that power scalable data management and access across multiple clouds.
Contribute to architecture design reviews, ensuring scalability, reliability, and cost efficiency.
Drive operational excellence through observability, automation, and continuous improvement.
Collaborate with analytics, infrastructure, and product teams to align technical delivery with business goals.
Learn and grow in advanced areas such as orchestration, governance, and privacy engineering.
5+ years of experience designing and developing distributed systems or data platforms.
Proficiency in Java or Python, with experience in containerized environments and CI/CD practices.
Hands-on experience with Kafka, Flink, Spark, Delta/Iceberg, and modern data stores (NoSQL or columnar); a brief sketch of such a pipeline follows this list.
Strong understanding of distributed systems fundamentals — performance, reliability, and fault tolerance.
Proven ability to independently deliver complex projects from design to production.
Bachelor’s or Master’s degree in Computer Science, or equivalent practical experience.
Shape the future of eBay’s Core Data Platform powering global analytics, AI, and ML workloads.
Tackle challenging distributed systems problems — scalability, freshness, and multi-cloud reliability.
Join a collaborative, inclusive culture that values curiosity, craftsmanship, and continuous learning.
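As a rough, hedged illustration of the ingestion path referenced above, the sketch below shows a Spark Structured Streaming job reading events from Kafka and appending them to an Iceberg table. The broker address, topic, catalog, and table names are hypothetical, and a real deployment would add schema handling, governance hooks, and durable checkpoint storage.

```python
from pyspark.sql import SparkSession

# Hedged sketch only: assumes the spark-sql-kafka and iceberg-spark-runtime
# packages are on the classpath; broker, catalog, and table names are made up.
spark = (
    SparkSession.builder
    .appName("ingestion-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Read raw events from a Kafka topic as an unbounded stream.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.com:9092")  # hypothetical broker
    .option("subscribe", "orders")                                 # hypothetical topic
    .load()
    .selectExpr("CAST(key AS STRING) AS key",
                "CAST(value AS STRING) AS payload",
                "timestamp")
)

# Append to an Iceberg table; the checkpoint lets the sink stay exactly-once.
query = (
    events.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # hypothetical path
    .toTable("demo.db.orders")                                # hypothetical table
)
query.awaitTermination()
```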

As a member of the Dublin team, you’ll play a key role in building, optimizing, and maintaining our Hadoop-based data warehouse and large-scale data pipelines. This is a hands-on engineering role where you’ll collaborate closely with data engineers, analysts, and platform teams to ensure our data platforms are scalable, reliable, and secure.
What you will accomplish
Design, develop, and maintain robust, scalable data pipelines using Hadoop and related ecosystems.
Implement and optimize ETL processes for both batch and streaming data needs across analytics platforms (a simplified batch example follows this list).
Collaborate cross-functionally with analytics, product, and engineering teams to align technical solutions with business priorities.
Ensure data security, reliability, and compliance across the entire infrastructure lifecycle.
Troubleshoot distributed systems and contribute to performance tuning, observability, and operational excellence.
Continuously learn and apply new open-source and cloud-native tools to improve data systems and processes.
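To make the batch side of that work concrete, here is a deliberately simplified sketch of a Spark SQL rollup job of the kind described above. The database, table, and column names are hypothetical, and a production job would parameterize the processing date, add data-quality checks, and run under an orchestrator.

```python
from pyspark.sql import SparkSession

# Hedged sketch: source and target tables are hypothetical placeholders.
spark = (
    SparkSession.builder
    .appName("daily-rollup-sketch")
    .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
    .enableHiveSupport()  # assumes a Hive metastore is reachable
    .getOrCreate()
)

# Aggregate one day of raw events into a per-page summary.
daily_summary = spark.sql("""
    SELECT
        event_date,
        page_id,
        COUNT(*)                AS views,
        COUNT(DISTINCT user_id) AS unique_users
    FROM raw.clickstream_events
    WHERE event_date = DATE '2024-01-01'
    GROUP BY event_date, page_id
""")

# Overwrite only the processed day's partition so reruns stay idempotent
# (assumes analytics.daily_page_summary exists, partitioned by event_date).
daily_summary.write.mode("overwrite").insertInto("analytics.daily_page_summary")

spark.stop()
```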
What you will bring
6+ years of experience in data engineering, with a strong foundation in distributed data systems.
Proficiency with Apache Kafka, Flink, Hive, Iceberg, and Spark SQL in large-scale environments.
Working knowledge of Apache Airflow for orchestration and workflow management.
Strong programming skills in Python, Java (Spring Boot), and SQL across various platforms (e.g., Oracle, SQL Server).
Experience with CI/CD, monitoring, and cloud-native tools (e.g., Jenkins, GitHub Actions, Docker, Kubernetes, Prometheus, Grafana).
Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).
The cool part
Work on one of eBay’s most impactful data infrastructure platforms, supporting global analytics and insights.
Join a collaborative, innovative engineering culture that embraces open-source and continuous learning.
Solve complex, high-scale data challenges that directly shape how eBay makes data-driven decisions.

The role within the Hadoop team involves:
Scope: Overseeing and enhancing Hadoop-related projects to meet eBay's extensive data-scale requirements, creating customer-facing tools, and ensuring smooth integration with other systems.
Impact: Directly influencing eBay's data strategy by enhancing data processing capabilities, improving system performance, and driving innovation across the organization.
What you will accomplish:
Enhance System Availability and Scalability: Spearhead efforts to optimize Hadoop-related projects, ensuring the system is highly available and scalable to meet eBay’s growing data demands. Your work will be pivotal in maintaining uninterrupted service and accommodating future growth, supporting eBay’s strategic objectives.
Drive High-Impact Projects: Lead initiatives that directly contribute to eBay’s ability to efficiently process and analyze vast amounts of data. Your enhancements will bolster system performance and reliability, vital for sustaining eBay's competitive edge.
Develop Innovative Solutions: Create and refine customer-facing tools that improve user experience and operational efficiency. Your contributions will streamline data access and management, making it easier for stakeholders to leverage insights.
Enhance Integration: Ensure seamless integration of Hadoop systems with other platforms, fostering a cohesive data ecosystem. Your work will enable cross-functional teams to reduce manual effort and use data insights effectively, driving informed decision-making across the organization.
What you will bring:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Strong programming skills in languages commonly used with Hadoop, such as Java, Scala, or Python, plus knowledge of common algorithms and data structures to write efficient, optimized code.
Familiarity with Linux/Unix systems, including shell scripting and system commands, and an understanding of networking principles, since Hadoop operates in distributed environments. Overall, you should bring the analytical skills to tackle complex distributed-systems challenges and optimize solutions (a small HDFS tooling sketch follows this list).
Optional: Experience with big-data technologies and frameworks; contributions to related open-source projects are a plus.
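For flavor, the short sketch below shows the kind of programmatic HDFS interaction this work can involve, using PyArrow's Hadoop filesystem bindings. The namenode address and paths are hypothetical, and real tooling for eBay's clusters would be considerably more involved.

```python
from pyarrow import fs

# Connect to a (hypothetical) HDFS namenode; requires libhdfs and the Hadoop
# client libraries on the machine running this script.
hdfs = fs.HadoopFileSystem(host="namenode.example.com", port=8020)

# List files under a dataset directory and report the total size.
infos = hdfs.get_file_info(fs.FileSelector("/data/events/2024-01-01", recursive=True))
total_bytes = sum(info.size for info in infos if info.type == fs.FileType.File)
print(f"{len(infos)} entries, {total_bytes / 1e9:.2f} GB under /data/events/2024-01-01")

# Read the first bytes of one file as a quick sanity check.
with hdfs.open_input_stream("/data/events/2024-01-01/part-00000.avro") as stream:
    header = stream.read(4)
    print("magic bytes:", header)
```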

As a Senior Software Engineer, you will help shape the next generation of our Hadoop-based analytics infrastructure. eBay operates one of the world’s largest Hadoop deployments. You’ll join a team of passionate engineers who thrive on building at scale and contributing to open-source innovation.
Lead the design and development of scalable, secure analytics infrastructure aligned with eBay’s platform vision.
Build and optimize production-grade frameworks and features using Hadoop, Spark, and Iceberg (see the maintenance sketch after this list).
Contribute to open-source projects that advance both eBay and the broader data community.
Collaborate across engineering teams to drive innovation, resiliency, and performance at massive scale.
Solve complex system challenges with creativity, data-driven thinking, and technical depth.
7+ years of software engineering experience with proven expertise in Java and distributed systems design.
Strong knowledge of Hadoop ecosystem technologies such as Hadoop, Spark, Iceberg, and YuniKorn.
Deep understanding of computer science fundamentals, performance tuning, and concurrency.
Experience working in Linux environments with strong networking and troubleshooting skills.
A collaborative mindset with excellent communication and analytical abilities.
Bachelor’s or Master’s degree in Computer Science, or equivalent experience in the field.
Shape the future of eBay’s Hadoop ecosystem and big-data infrastructure.
Work at global scale, driving analytics that power millions of eBay experiences.
Join a culture that encourages open-source contribution, innovation, and collaboration.
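As a small, hedged illustration of the Spark-plus-Iceberg work mentioned above, the sketch below inspects an Iceberg table's snapshot history and compacts small files with the built-in rewrite_data_files procedure. The catalog and table names are hypothetical, and the Iceberg Spark runtime is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession

# Spark session wired to a hypothetical Iceberg catalog named "prod";
# the Hive-metastore details for that catalog are omitted here.
spark = (
    SparkSession.builder
    .appName("iceberg-maintenance-sketch")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.prod", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.prod.type", "hive")
    .getOrCreate()
)

# Inspect the table's snapshot history via Iceberg's metadata tables.
spark.sql(
    "SELECT snapshot_id, committed_at, operation "
    "FROM prod.db.events.snapshots ORDER BY committed_at DESC"
).show(5)

# Compact small files into larger ones using Iceberg's stored procedure.
spark.sql("CALL prod.system.rewrite_data_files(table => 'db.events')").show()

spark.stop()
```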

Design, develop, and maintain traffic management solutions.
Optimize network protocols, configurations, and Points of Presence (PoP).
Develop and implement advanced caching strategies to improve system efficiency.
Develop high-performance applications in C++ and Go.
Deploy and manage scalable systems on Kubernetes.
Implement observability tools and practices to monitor system performance and health.
Collaborate with cross-functional teams to integrate networking solutions.
Monitor and analyze traffic patterns and security threats.
Contribute to and lead open-source projects.
Innovate and develop patented technologies to improve network capabilities.
Stay updated with the latest developments in network technologies, security practices, and software development.
Identify gaps and issues across systems and functional areas, propose solutions, and drive them to resolution.
Champion best practices and advanced concepts, and impact the business by delivering solutions that address its needs.
Lead and empower others, taking responsibility for small projects and collaborating across functional teams to influence change.
Actively seek feedback and ways to improve team performance and projects, demonstrating strong communication skills.
At least 5 years of experience in cloud networking or traffic management.
Strong programming skills in C++ and Go.
Experience with TCP/IP networking.
Familiarity with TCP, SSL, and HTTP protocols.
Expertise in using Kubernetes for orchestrating containerized applications.
Experience with observability tools and practices.
Experience with Envoy for traffic control.
Experience contributing to and leading open-source projects.
Certifications in networking, Kubernetes, cyber security, or related fields.
Experience in a high-traffic, large-scale environment.
Familiarity with additional programming languages, including Java, or frameworks.
Proficiency in Agile development methodologies.
Experience in patent creation and innovation.
Experience in implementing caching strategies and optimizing PoPs.

As a Senior Data Development Infrastructure Developer, you will play a pivotal role in designing, developing, and operating a high-performance, distributed Data+AI job scheduling and execution engine based on Apache Airflow. You will be responsible for customizing, enhancing, and extending the capabilities of Apache Airflow to meet eBay’s specific needs, as sketched below. You will work closely with the open-source community, potentially contributing to the Airflow codebase and influencing the direction of the project.
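To make that scope concrete, here is a minimal, hedged sketch of the kind of Airflow extension work involved: a custom operator plus a small DAG that uses it. The operator name, job names, and schedule are illustrative only; a real operator for eBay's scheduling engine would call internal service APIs and handle retries and monitoring.

```python
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator


class SubmitBatchJobOperator(BaseOperator):
    """Hypothetical operator that submits a job to an internal batch service."""

    def __init__(self, job_name: str, queue_name: str = "default", **kwargs):
        super().__init__(**kwargs)
        self.job_name = job_name
        self.queue_name = queue_name

    def execute(self, context):
        # A real operator would call the scheduling service's API and poll for
        # completion; this sketch only logs what would happen.
        self.log.info("Submitting %s to queue %s", self.job_name, self.queue_name)
        return f"{self.job_name}-submitted"


with DAG(
    dag_id="example_custom_scheduler_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    ingest = SubmitBatchJobOperator(task_id="ingest", job_name="ingest-events")
    aggregate = SubmitBatchJobOperator(task_id="aggregate", job_name="daily-rollup")
    ingest >> aggregate
```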
What you will accomplish: