Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Creates secure and high-quality production code.
- Produces architecture and design artifacts for complex applications.
- Proactively identifies hidden problems and patterns in data, and uses these insights to drive improvements to coding hygiene and system architecture.
- Contributes to software engineering communities of practice and events that explore new and emerging technologies.
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Hands-on Java and Spark experience in the ETL domain
- Strong knowledge of the Spring Framework, REST APIs, design patterns, event-driven design principles, and database queries
- Hands-on experience building microservices and developing cloud-native applications on AWS (Step Functions, DynamoDB, Lambda, S3, RDS Aurora, EC2)
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience in developing, debugging, reverse engineering, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Solid understanding of agile methodologies and practices such as CI/CD, application resiliency, and security
- Primary skills: Java, Spark, REST APIs, the AWS cloud tech stack (S3, ECS/EKS, DynamoDB, Step Functions, Lambda), and Terraform
- Experience with big data engineering and processing, data lakes, and data warehouse patterns
Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- Familiarity with Databricks is highly desirable; experience with AWS Lake Formation would be beneficial if not hands-on with Databricks
- Exposure to data lake and data engineering products