Card Risk Modeling - AI ML Sr. Associate
Job Responsibilities-
- Design and develop machine learning models to drive impactful credit decisions for the card business throughout the credit card lifecycle (e.g., acquisition, account management, transaction authorization, collection)
- Leverage cutting-edge machine learning techniques, including deep learning architectures, on big data platforms, with a strong emphasis on the interpretability and replicability of these techniques
- Work closely with the senior management team to develop ambitious, innovative modeling solutions and implement them in production to drive significant business impact
- Collaborate with business partners across marketing, risk, technology, model governance, compliance, etc., throughout the entire modeling lifecycle (development, review, deployment, and ongoing monitoring)
Required qualifications, capabilities and skills-
- Ph.D. or Master’s degree from an accredited university in a quantitative field such as Computer Science, Mathematics, Statistics, Econometrics, or Engineering
- Exceptional coding skills with at least one year of professional coding experience (e.g., Python, SAS, Spark, Scala, or TensorFlow) and experience with big data platforms (e.g., Hadoop, HDFS, Teradata, Snowflake, AWS, Hive)
- Solid understanding of advanced statistical methods and machine learning techniques: GLM/Regression, Random Forests, Boosted Trees, Neural Networks, Clustering, KNN, Anomaly Detection, etc.
- Strong ability to interpret complex data, form a coherent story from it, and communicate to a wide range of audiences with varying degrees of technical acumen, including senior leadership and executives
- Advanced problem-solving and exceptional analytical skills
Preferred qualifications, capabilities and skills-
- Experience in the credit card industry with strong business acumen
- Experience in interpreting/explaining machine learning models such as XGBoost and GBM
- Strong ownership and execution; proven experience in implementing models in production
- Expertise in data wrangling and model building in a distributed Spark compute environment