Expoint - all jobs in one place

JPMorgan Software Engineer III - Hadoop 
India, Karnataka, Bengaluru 
719062327

11.03.2025

Job responsibilities

  • Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  • Creates secure, high-quality production code and maintains algorithms that run synchronously with the appropriate systems
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
  • Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
  • Contributes to software engineering communities of practice and events that explore new and emerging technologies
  • Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills

  • Formal training or certification in software engineering concepts and 3+ years of applied experience
  • Hands-on experience in cloud and Big Data environments
  • Experience leading and managing software engineering teams, with a record of successful project delivery
  • Proficiency in programming languages such as Python or Java, with cloud-native development experience
  • Expertise in AWS services (Data Lakes, EMR, EKS/ECS, Glue) and Hadoop ecosystem tools
  • Experience with Infrastructure as Code (IaC) tools such as Terraform for cloud resource management
  • Strong understanding of the SDLC, agile methodologies, and CI/CD, with exposure to AI and machine learning applications

Preferred qualifications, capabilities, and skills

  • Familiarity with Cloudera Data Platform
  • Familiarity with Databricks and open table formats such as Iceberg and Delta Lake
  • AWS Cloud certification is a plus
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field