Job responsibilities
- Design & build new applications using leading-edge technologies, and modernize existing applications
- Implement batch & real-time software components consistent with architectural best practices of reliability, security, operational efficiency, cost-effectiveness, and performance
- Ensure quality of deployed code via automated unit, integration & acceptance testing
- Collaborate with multi-national agile development, support and business teams to meet sprint objectives
- Participate in all agile meetings & rituals, including daily standups, sprint planning, backlog reviews, demos, and retrospectives
- Provide level 2 support for production systems
- Learn and apply system processes, methodologies, and skills to develop secure, stable code and systems
- Add to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 2+ years applied experience
- Hands-on experience in frameworks, system design, application development, testing, and operational stability
- Advanced proficiency in one or more programming languages (e.g., Java, with frameworks such as Spring, microservices, and APIs)
- Experience with Apache Spark or similar large-scale data processing engines
- Experience with distributed datastores (e.g., Cassandra)
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies and engineering practices such as CI/CD, application resiliency, and security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, big data, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills
- Experience designing, developing, and deploying software components on AWS using common compute and storage services such as EC2, EKS, Lambda, and S3
- Experience with big data / distributed / cloud technologies: AWS big data services (e.g., Lambda, Glue, EMR); Spark architecture, performance tuning, Spark SQL, and streaming; Kafka; entitlements; etc.
- Certified AWS Cloud Practitioner, Developer or Solutions Architect strongly preferred
- Experience using Terraform to deploy infrastructure-as-code to public cloud
- Experience with scripting on Linux using languages such as Bash, KSH, or Python