Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience developing, debugging, and maintaining code in a large corporate environment using one or more modern programming languages and database querying languages.
- Hands-on practical experience developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components such as Spark SQL and Spark Streaming.
- Proficiency in one or more programming languages such as Java, Scala, or Python.
- Experience with relational and NoSQL databases.
- Cloud implementation experience with AWS, including:
  - AWS Data Services: Proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Data Catalog, Athena, Kinesis or MSK, and Airflow or Lambda with Step Functions and EventBridge
  - Data De/Serialization: Expertise in at least two of the following formats: Parquet, Iceberg, Avro, JSON-LD
  - AWS Data Security: Good understanding of security concepts such as Lake Formation permissions, IAM, service roles, encryption, KMS, and Secrets Manager
Preferred qualifications, capabilities, and skills
- Proficiency in automation and continuous delivery methods.
- Experience with Snowflake is a nice to have.
- Proficient in all aspects of the Software Development Life Cycle.
- Solid understanding of agile methodologies and practices such as CI/CD, application resiliency, and security.
- In-depth knowledge of the financial services industry and its IT systems.
- Practical cloud-native experience, preferably with AWS.