In this role, you will build and operationalize data pipelines in a modern cloud environment (e.g., AWS, Snowflake), automate workflows (Dagster-centric), manage infrastructure as code (Terraform), and establish robust, auditable CI/CD practices. You’ll partner closely with the Data Engineering, FP&A Reporting, and Analytics teams to deliver timely, reliable, and secure data solutions critical for regulatory (SOx) and business reporting.
Key Job Responsibilities and Duties:
Producing curated, reusable analytical data products that enable self-serve analytics for internal customers across departments.
Modeling data following best practices and Data Warehousing methodologies such as Data Vault and Kimball Dimensional modeling.
Transforming large, complex data sets into pragmatic, actionable insights and providing them in a consumable format for historical or predictive analysis.
Maintaining and tuning data pipeline health, including troubleshooting issues, implementing data quality controls, monitoring performance, and proactively addressing issues and risks.
Leading the technical resolution of problems and communicating outcomes to both technical and non-technical audiences.
Supporting product teams in defining the Data Architecture for their domains, from conceptual to physical modeling in the Data Warehouse.
Championing a culture of data quality and data governance, and their best practices, across the business unit.
Driving the implementation of reliable, trusted metrics defined by the business, connecting disparate datasets into unified data products in the Lakehouse and/or Data Warehouse.
Performing Data Governance responsibilities such as technical stewardship, data classification, compliance management, data quality monitoring, and security considerations.
Working autonomously, self-steering initiatives, and defining and breaking down work for more junior members of the team.
Mapping data flows between systems and workflows across the company to improve efficiency and resilience.
Developing scalable, real-time event-based streaming data pipelines to support internal and customer-facing use cases.
Ensuring ongoing reliability and performance of data pipelines through proactive monitoring, end-to-end testing standards, and incident handling.
Writing maintainable, reusable code by applying standard libraries and design patterns, and refactoring for simplicity and clarity.
Developing scalable and extensible physical data models aligned with operational workflows and infrastructure constraints.
Owning end-to-end data applications by defining and tracking SLIs and SLOs to ensure reliability and quality.
Qualifications & Skills:
Minimum of 5 years of experience as a Data Engineer or in a similar role, with a consistent record of successfully delivering data solutions.
Experience building production data pipelines in the cloud, including data lake and serverless solutions; hands-on experience with schema design and data modeling, and with working alongside business stakeholders and data engineers to deliver production-level data solutions.
Experience designing and implementing mature Data Warehouse pipelines using Data Vault and/or Dimensional modeling methodologies is a must.
Working with ETL/ELT tools and methodologies.
Working with relational databases and any flavor of SQL in an analytical context.
Building data exploration and visualization solutions and designing data storytelling.
Communicating effectively (written and spoken) and managing stakeholders.
Writing and maintaining high-quality and reusable code, applying design patterns and meeting coding standards.
Comfortable working in a DevOps/DataOps environment.
A proven record of working with workflow management and scheduling tools such as Apache Airflow and/or Dagster.
Booking.com’s Total Rewards Philosophy is not only about compensation but also about benefits. We offer a competitive compensation package as well as unique-to-Booking.com benefits, which include:
Annual paid time off and a generous paid leave scheme, including parent, grandparent, bereavement, and care leave
Hybrid working, including flexible working arrangements and up to 20 days per year working from abroad (home country)
Industry-leading product discounts - up to 1400 per year - for yourself, including automatic Genius Level 3 status and Booking.com wallet credit
Application Process:
Let’s go places together:
You will receive detailed instructions on post-application requirements, including any required application materials, deadlines, portfolios, coding challenges, or other assessments.
Pre-Employment Screening
If your application is successful, your personal data may be used for a pre-employment screening check by a third party as permitted by applicable law. Depending on the vacancy and applicable law, a pre-employment screening may include employment history, education and other information (such as media information) that may be necessary for determining your qualifications and suitability for the position.