Expoint - all jobs in one place


PayPal Data Engineer
France, Auvergne-Rhône-Alpes
Job ID 554740168

18.08.2024

What you need to know about the role

PayPal's Marketing Technology team is dedicated to building a best-in-class platform, and we are looking for highly talented, professional, and motivated engineers to join us. As a Data Engineer 3 on our Marketing Technology Platform, you will be at the forefront of designing and developing backend data pipelines in Python on GCP (BigQuery, Bigtable, and Dataproc). You will engage in all facets of data pipeline development, including design, coding, security, testing, and production releases, and you will be responsible for delivering new features, enhancements, and fixes for the data pipelines within the MarTech platform.


Your day to day

  • Build highly scalable, high-throughput backend data pipelines in GCP using BigQuery and Dataproc.
  • Write and maintain pipelines in Python.
  • Schedule jobs using UC4 and Airflow.
  • Independently work on multiple product features, utilizing your technical
    expertise to propose innovative solutions for both new and existing
    functionalities, informed by a growing understanding of our products and the
    business domain.
  • Manage your own project deliverables, timelines, and priorities, effectively
    balancing multiple tasks to meet project deadlines and performance targets.
  • Actively engage in design and code reviews, providing constructive feedback to
    peers and incorporating feedback into your own work to maintain high standards
    of code quality and functionality.
  • Share your knowledge and experience with new members to help onboard them onto the team quickly and efficiently, fostering a culture of continuous learning.
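The pipeline work described above can be sketched in miniature. This is an illustrative example only, not PayPal's actual code: Python's built-in sqlite3 stands in for a warehouse such as BigQuery, and all table and column names are hypothetical.

```python
import sqlite3

def run_pipeline(conn):
    """Minimal ETL sketch: extract raw rows, transform with SQL,
    load the result into a reporting table."""
    cur = conn.cursor()
    # Extract/load stage: stage raw event rows (hypothetical schema).
    cur.execute("CREATE TABLE IF NOT EXISTS events (user_id TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO events VALUES (?, ?)",
        [("u1", 10.0), ("u1", 5.0), ("u2", 7.5)],
    )
    # Transform stage: aggregate spend per user with SQL.
    cur.execute(
        """
        CREATE TABLE user_spend AS
        SELECT user_id, SUM(amount) AS total
        FROM events
        GROUP BY user_id
        """
    )
    conn.commit()
    return dict(cur.execute("SELECT user_id, total FROM user_spend"))

if __name__ == "__main__":
    print(run_pipeline(sqlite3.connect(":memory:")))  # {'u1': 15.0, 'u2': 7.5}
```

In production such a transform would run as a scheduled job (e.g. an Airflow task) against BigQuery rather than an in-memory database.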

What do you need to bring?

  • A bachelor’s degree in computer science or an equivalent combination of
    technical education and work experience.
  • 5+ years of ETL expertise, i.e., managing data extraction, transformation, and loading from various sources using advanced SQL and Jupyter Notebooks/Python.
  • Working knowledge of big data, GCP cloud databases, and streaming integrations.
  • Experience designing and building highly scalable distributed applications that handle very high data volumes in GCP using BigQuery and Python.
  • Strong conceptual knowledge of data warehouses, data marts, distributed data platforms, data lakes, data modeling, schema design, and CI/CD.
  • Experience working on SaaS platform(s); Adobe RTCDP is a plus.
  • Experience using Atlassian Jira, ServiceNow, and Atlassian Confluence.
  • Experience delivering projects using Agile methodology.
