Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Designs and delivers trusted data collection, storage, access, and analytics data platform solutions in a secure, stable, and scalable way
- Defines database backup, recovery, and archiving strategy
- Designs and develops data pipelines to ingest, store, and process data from multiple sources
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 3+ years applied experience
- Experience with AWS cloud technologies, including S3
- Experience with SQL-based technologies (e.g., MySQL, Oracle DB)
- Experience in the Java or Python programming languages
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- AWS Data Services: Proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, Kinesis or MSK, and Airflow or Lambda + Step Functions + EventBridge
- Data De/Serialization: Expertise in at least two of the following formats: Parquet, Iceberg, Avro, JSON Lines (JSONL)
- AWS Data Security: Good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
- Experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills
- Snowflake knowledge or experience
- In-depth knowledge of the financial services industry and its IT systems
- Experience building data lakes, data platforms, and data frameworks, and building or designing Data-as-a-Service APIs