Job responsibilities
- Implements and delivers engineering solutions/tools and data products using cloud and on-premises data, distributed computing and emerging technologies.
- Develops secure and high-quality production code, and reviews and debugs code written by others.
- Contributes through the full software development lifecycle including architecture, proofs of concept, prototyping, development, rollout and support.
- Prioritizes solving customer requests and issue reports, and participates in support coverage.
- Actively contributes to the engineering community as an advocate of firmwide frameworks, tools, and practices of the Software Development Life Cycle.
- Adds to the team culture of diversity, equity, inclusion, and respect.
Required qualifications, capabilities, and skills
- Formal training or certification in Computer Science / Engineering concepts and 5+ years of hands-on professional experience building complex software systems in private and public cloud environments (AWS).
- Degree in Computer Science, Computer Engineering, Mathematics, or a related technical field.
- Advanced knowledge of software applications and technical processes, with in-depth expertise in application, data, cloud, and infrastructure architecture disciplines.
- Advanced skills in Java and Python.
- Strong hands-on experience with PostgreSQL, Amazon Redshift, Athena, Glue, Kafka, and NoSQL databases (Cassandra, MongoDB).
- Experience with fundamental DevOps practices including CI/CD.
- Ability to tackle design and functionality problems independently; a self-starter who thrives in a fast-paced, agile setting.
- Clear and effective verbal and written communication skills, with the ability to communicate seamlessly across engineering and data science teams.
Preferred qualifications, capabilities, and skills
- Excellent problem-solving and analytical skills.
- Proficiency with containers and cloud technologies, including Docker, Kubernetes, and AWS.
- Hands-on experience building ETL/data pipelines and data lake platforms (e.g., Databricks, Spark/Hadoop, Snowflake).
- Knowledge of workflow orchestration tools (e.g., Apache Airflow) and integration technologies (e.g., GraphQL, REST).
- Experience building and deploying machine learning models, and familiarity with the ML lifecycle.