Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years applied experience
- Hands-on experience with Databricks and with Python/Java and PySpark; knowledge of Airflow preferred
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience developing, debugging, and maintaining code in a large corporate environment, with knowledge of one or more RDBMS and data warehouses; expert scripting knowledge (SQL/Shell/Perl)
- Demonstrable ability to code in one or more languages
- Experience with Spring Boot, microservices, Kafka, Cassandra, and API development
- Experience with CI/CD (Jenkins); advanced SQL (e.g., joins and aggregations) and a good understanding of NoSQL databases
- Emerging knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills
- Familiarity with Databricks Lakehouse, Delta Lake, and Delta Live Tables (DLT)
- Exposure to cloud technologies