Job responsibilities
- Design & build new applications using leading-edge technologies and modernize existing applications
- Implement batch & real-time software components consistent with architectural best practices for reliability, security, operational efficiency, cost-effectiveness, and performance
- Ensure quality of deployed code via automated unit, integration & acceptance testing
- Collaborate with multinational agile development, support, and business teams to meet sprint objectives
- Participate in all agile meetings & ceremonies, including daily standups, sprint planning, backlog refinement, demos, and retrospectives
- Provide level 2 support for production systems
- Learn and apply new processes, tools & technologies for personal & team growth and to continuously improve the team's products
- Add to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training and certification on software engineering concepts and 3+ years applied experience
- Strong experience in Big Data development & ETL data pipeline implementation using Apache Spark
- Experience provisioning and tuning AWS infrastructure for ETL, such as EMR, S3, Glue, and Athena
- Experience designing, developing, and deploying solutions on AWS using services such as EC2, EKS, Aurora, SQS, and MSK
- Must demonstrate strong analytical and troubleshooting skills
Preferred qualifications, capabilities, and skills
- AWS Certified Developer, Solutions Architect, or Data Engineer certification strongly preferred
- Experience coding Java applications using Spring Boot
- Experience using Terraform to deploy infrastructure-as-code to public cloud
- Experience with Linux scripting languages such as Bash, KSH, or Python