Key Responsibilities
- Responsible for independently owning the design and delivery of entire Data Science solutions based on business requirements, taking them from the initial idea-generation phase through to implementation.
- Responsible for working autonomously on consuming data, preparing it for data science use, translating business problems into data stories and solving them effectively by applying suitable statistical data analysis and modeling methods. Leverages input from business stakeholders in all of the above to determine the right technical solutions to deliver customer value.
- Responsible for working primarily within the scope of their team to deliver work that informs business decisions for the product / business topic they currently work on, occasionally collaborating across multiple teams.
- Responsible for designing and interpreting quantitative experiments to objectively guide key business decision making.
- Responsible for independently identifying and managing stakeholders in operational, project-based, and managerial roles. Effectively communicates, addresses stakeholder needs, and conveys complex analysis results in a clear manner. Begins to guide junior colleagues in stakeholder engagement.
- Responsible for flexibly adopting existing internal and external Data Science approaches, spotting and proposing opportunities to apply new approaches, and expanding their own and their peers' technical competencies when a more efficient way presents itself.
- Responsible for leveraging Data Science for impact while learning to incorporate scalability, reproducibility and long-term orientation in their work.
- Responsible for acting as a force multiplier for junior peers in the team by actively helping them with their craft through coaching, mentoring and setting a good example.
- Responsible for contributing to the community of their area by supporting community projects aimed at making other data professionals more effective, actively participating in community-strengthening activities (e.g. recruitment) and connecting with peers beyond their area.
- Must be knowledgeable about the larger data ecosystem at Booking and how it relates to the work they do.
- Responsible for collaborating with peers in related crafts (e.g. Data Engineering), anticipating data needs to ensure smooth progress of their individual projects.
- Must be knowledgeable about the operational, tactical and strategic goals of their area (i.e. track/vertical) and have high-level knowledge of the goals of their wider area (i.e. Business Unit).
- Responsible for ensuring quality of their own work by validating it through peer review.
- Responsible for influencing business decisions within their specific area (e.g. product team).
Communication (Stakeholder: Type of communication; Frequency)
- Business Stakeholders: Cooperation, Information (input, feedback and providing data outcomes to steer the business); Frequent
- Craft team: Cooperation (support and collaboration); Continuous
- Managers: Cooperation, Information (support, input, feedback and providing data outcomes to steer the business); Frequent
- Leadership team: Information (providing data to steer the business); Occasional
- DSnA Community: Cooperation, Information (knowledge sharing, collaboration and support); Frequent
- Other tech crafts: Information (data collection and tracking); Occasional
Level of Education
- Master's degree
Years of relevant Job Knowledge
- Broad Job Knowledge (3-5 years)
Requirements of special knowledge/skills
- Professional experience in the broad field of Data Engineering and a Master's degree in a quantitative field.
- Experience working directly in Analytics or Data Science, or prior collaboration on end-to-end (E2E) projects with colleagues specializing in these areas
- Working knowledge of the statistics behind A/B testing and basic modeling techniques such as linear regression.
- Experience designing systems E2E, going from problem statements via ideation to data and metrics in production.
- Proven knowledge of Python and Spark
- Knowledge of Hadoop, Hive, Oozie, Kafka, Airflow.
- Experience with Data Warehousing and ETL/ELT pipelines at scale
- Excellent verbal and written communication skills with both technical and product stakeholders
- Experience solving real problems using data mining techniques and with statistical rigor
- Strong technical skills in data analysis, statistics, machine learning and programming, with strong working knowledge of Hadoop, Spark, SQL, Python and/or R.
- Strong data visualization skills
- Strong skills in storytelling with data
Pre-Employment Screening
If your application is successful, your personal data may be used for a pre-employment screening check by a third party as permitted by applicable law. Depending on the vacancy and applicable law, a pre-employment screening may include employment history, education and other information (such as media information) that may be necessary for determining your qualifications and suitability for the position.