Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcome-oriented assessment of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience
- Experience building data pipelines using Apache Spark, with proficiency in Scala and Python, on platforms such as AWS EMR or Databricks
- Hands-on experience with IaC tools like Terraform
- Knowledge of open table formats such as Iceberg and Delta Lake, along with expertise in AWS Glue Data Catalog and fine-grained access control
- Advanced proficiency in one or more programming languages (e.g., Python, Java)
- Proficiency in automation and continuous delivery methods
- Advanced understanding of software development practices such as CI/CD, application resiliency, and security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
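To give candidates a concrete sense of the pipeline work described above, here is a minimal, hypothetical sketch in plain Python of the extract-transform-load shape such a job takes. In production this logic would be expressed as PySpark DataFrame operations running on AWS EMR or Databricks; the record schema, thresholds, and bucket path below are invented for illustration only.

```python
# Hypothetical ETL sketch. In a real pipeline these steps would be
# PySpark DataFrame operations (filter, groupBy, sum) on EMR/Databricks.

def extract(rows):
    """Parse raw CSV-style records into dicts (schema is invented)."""
    for row in rows:
        user_id, country, amount = row.split(",")
        yield {"user_id": user_id, "country": country, "amount": float(amount)}

def transform(records):
    """Filter and aggregate: total spend per country, mirroring the
    filter + groupBy + sum a Spark job would express declaratively."""
    totals = {}
    for r in records:
        if r["amount"] > 0:  # drop refunds / invalid rows
            totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

def load(totals):
    """Stand-in for writing partitioned output to S3 (path is a placeholder)."""
    return {f"s3://bucket/report/country={c}": v for c, v in totals.items()}

raw = ["u1,US,10.0", "u2,US,5.5", "u3,GB,-2.0", "u4,GB,7.0"]
print(load(transform(extract(raw))))
# → {'s3://bucket/report/country=US': 15.5, 's3://bucket/report/country=GB': 7.0}
```

The same shape scales from this toy loop to a distributed Spark job: only the execution engine changes, not the extract/transform/load structure.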
Preferred qualifications, capabilities, and skills
- Extensive hands-on experience with AWS Cloud services, including S3, IAM, EMR, Glue, ECS/EKS, and Athena
- Certifications in AWS, Databricks, and Terraform are highly desirable
- 1+ years of experience implementing data pipelines using Databricks, including tools such as Unity Catalog, Databricks Workflows, and Delta Live Tables
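For context on the Databricks Workflows item above, a job is typically defined as a JSON payload against the Databricks Jobs API. The sketch below shows that shape as a Python dict; the field names follow the Jobs API 2.1 to the best of my knowledge, and the job name, notebook path, cluster spec, and cron expression are invented placeholders, not a working configuration.

```python
import json

# Hypothetical Databricks Workflows job definition, shaped like the
# payload for the Jobs API "create" call. All values are placeholders.
job_config = {
    "name": "nightly-ingest",
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {"notebook_path": "/Repos/team/ingest"},
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "schedule": {
        "quartz_cron_expression": "0 0 2 * * ?",  # nightly at 02:00
        "timezone_id": "UTC",
    },
}

print(json.dumps(job_config, indent=2))
```

In practice the same definition is usually managed declaratively, e.g. via the Terraform Databricks provider mentioned in the IaC requirement, rather than hand-posted JSON.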