
JPMorgan Software Engineer III - ETL Informatica Developer IDMC 
India, Karnataka, Bengaluru 
323945796

15.07.2025


Job responsibilities

  • Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  • Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
  • Gathers, analyzes, synthesizes, and develops visualizations and reporting for large, diverse data sets in service of continuous improvement of software applications and systems
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
  • Contributes to software engineering communities of practice and events that explore new and emerging technologies
  • Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills

  • Formal training or certification in Unix and Oracle (both SQL and PL/SQL) concepts, and 3+ years of applied experience.
  • Expertise in Informatica PowerCenter and Informatica Intelligent Data Management Cloud (IDMC) for ETL and data integration.
  • Knowledge of data warehousing concepts, schema design, and performance tuning for ETL processes.
  • Strong expertise in databases, including SQL and PL/SQL skills for querying, data manipulation, and managing database objects.
  • Proficiency with core AWS services (EC2, S3, Lambda, Glue) for data storage, orchestration, and processing.
  • Solid understanding of the Databricks platform, including key features, workspace management, and best practices for scalable data analytics.
  • Understanding of job scheduling tools (such as AutoSys) to automate and monitor ETL processes.
  • Working exposure to tools such as JIRA, Confluence, and ServiceNow.
  • Strong problem-solving, effective communication, and documentation skills for collaboration with cross-functional teams.
  • Python coding skills for data processing, automation, and custom transformations.
  • Hands-on experience with Databricks and Spark for large-scale data processing and analytics, including writing and optimizing Spark/SQL queries.

Preferred qualifications, capabilities, and skills

  • Any exposure to Apache NiFi would be an added advantage.