Job Responsibilities:
- Land data from various firm sources into the big data warehouse
- Investigate and provide support on data issues
- Develop automation for data extraction
- Design and tune schemas for data landed on the platform
- Partner with information modelling teams on firmwide logical data models
- Serve as the primary subject matter expert (SME) for data in the analytics platform
- Develop data quality rules and controls
- Analyze and resolve query performance bottlenecks in cloud-based services such as Amazon Redshift and AWS Glue
Required qualifications, capabilities and skills:
- Formal training or certification in Computer Science concepts and 3+ years of applied experience
- Strong hands-on coding skills in Python, Java, Apache Spark and SQL
- Strong CS fundamentals in data structures and algorithms, with a good understanding of big data
- Experience with AWS application development, including services such as Lambda, Glue and ECS/EKS
- Excellent communication skills
- Experience with Unix/Linux and shell scripting
Preferred qualifications, capabilities and skills:
- Good understanding of data modelling challenges with big data
- Good understanding of financial data, especially in front-office investment banking, is a major plus
- Ability to code Apache Spark jobs in Scala is an added advantage