We are looking for an Applied Scientist to push the boundaries of machine learning and simulation at scale. As a member of PXT Finance, you will own the development of distributed simulations for population dynamics forecasting and expense planning. This includes building state-of-the-art time series and survival models that leverage deep learning techniques such as Graph Neural Networks, Recurrent Neural Networks, and Transformers. Working on a cross-functional team, you will collaborate closely with data scientists, software engineers, product managers, and finance managers. This is a high-impact role in which you will develop applications used by planners and decision makers across Amazon.
Key job responsibilities
- Lead the development of distributed simulations for population dynamics forecasting and expense planning.
- Implement state-of-the-art graph deep learning models for tasks such as node survival analysis and spatio-temporal time series forecasting (see the model sketch after this list).
- Leverage big data processing frameworks such as Apache Spark to ingest high-volume headcount data into machine learning workflows (see the ingestion sketch after this list).
- Apply GPU programming frameworks such as PyTorch and CUDA to deploy fast and scalable planning solutions.
- Push the boundary of machine learning-based simulation and publish results in peer-reviewed journals.
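As a rough illustration of the kind of modeling named above, the sketch below wires one round of neighbor averaging to a discrete-time survival head in plain PyTorch. It is a minimal sketch only: the class name HeadcountGNN, the tensor shapes, and the toy data are hypothetical and are not drawn from the role description.

```python
# Illustrative sketch only: a single graph-averaging layer plus a discrete-time
# survival head for node-level forecasting. All names and shapes are hypothetical.
import torch
import torch.nn as nn

class HeadcountGNN(nn.Module):
    """One round of neighbor averaging followed by per-node hazard prediction."""
    def __init__(self, in_dim: int, hidden_dim: int, horizon: int):
        super().__init__()
        self.encode = nn.Linear(in_dim, hidden_dim)
        self.hazard = nn.Linear(hidden_dim, horizon)  # one hazard per future period

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj is a row-normalized adjacency matrix (n_nodes x n_nodes);
        # multiplying by it averages each node's neighbor features.
        h = torch.relu(self.encode(adj @ x))
        # Sigmoid hazards per period; survival is the cumulative product of (1 - hazard).
        hazards = torch.sigmoid(self.hazard(h))
        return torch.cumprod(1.0 - hazards, dim=-1)

# Toy usage with random data.
n_nodes, in_dim = 8, 4
x = torch.randn(n_nodes, in_dim)
adj = torch.rand(n_nodes, n_nodes)
adj = adj / adj.sum(dim=1, keepdim=True)  # row-normalize the adjacency
model = HeadcountGNN(in_dim, hidden_dim=16, horizon=12)
print(model(x, adj).shape)  # (8, 12): per-node survival curve over 12 periods
```

A production model would replace the single averaging step with a deeper GNN stack and a calibrated survival loss; the sketch only pins down the node-in, survival-curve-out shape of the task.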
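Along the same lines, the following PySpark sketch shows one way headcount snapshots might be aggregated before feeding a model. The S3 path and column names are placeholders invented for illustration, not actual data sources.

```python
# Illustrative sketch only: aggregating headcount snapshots with PySpark and
# collecting the result for downstream modeling. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("headcount-ingest").getOrCreate()

snapshots = (
    spark.read.parquet("s3://example-bucket/headcount/snapshots/")  # hypothetical path
    .filter(F.col("snapshot_date") >= "2024-01-01")
    .groupBy("org_id", "snapshot_date")
    .agg(F.count("employee_id").alias("headcount"))
)

# Collect the aggregated panel to the driver for model training;
# for larger data this would stay distributed instead.
panel = snapshots.toPandas()
print(panel.head())
```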
A day in the life
As an Applied Scientist in Finance, you will focus on data science and machine learning engineering workflows. This means ownership of models from the idea stage all the way to production deployment. In addition to programming and system design, you'll also work with customers, partners, and leaders to guide project decisions.
- 3+ years of experience building models for business applications
- PhD, or Master's degree and 4+ years of experience in CS, CE, ML, or a related field
- Experience programming in Java, C++, Python, or a related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing
- Experience in professional software development
- Experience using big data frameworks such as Spark and Hadoop