
JPMorgan Software Engineer III - Databricks ETL 
United States, New Jersey, Jersey City 
83717945

09.09.2025

Job Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
  • Use SQL frequently and understand NoSQL databases and their niche in the marketplace
  • Implement best practices for data engineering, ensuring data quality, reliability, and performance
  • Contribute to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows
  • Perform data extraction and implement complex data transformation logic to meet business requirements
  • Monitor and execute data quality checks to proactively identify and address anomalies
  • Ensure data availability and accuracy for analytical purposes
  • Identify opportunities for process automation within data engineering workflows
  • Deploy and manage containerized applications using Amazon ECS or Amazon EKS (Kubernetes)
  • Implement data orchestration and workflow automation using AWS Step Functions and Amazon EventBridge
  • Use Terraform for infrastructure provisioning and management, ensuring a robust and scalable data infrastructure.
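The pipeline and data-quality responsibilities above can be sketched in Python. This is a minimal illustration, not part of the role's actual codebase: the field names, the z-score threshold, and the in-memory "sink" are all assumptions made for the example.

```python
from statistics import mean, stdev

def extract(rows):
    """Extract step: in practice this would read from a source system
    (e.g., S3, Kafka, or a relational database); here it passes rows through."""
    return list(rows)

def transform(rows):
    """Transform step: normalize field names and cast amounts to float."""
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def quality_check(rows, z_threshold=3.0):
    """Flag rows whose amount deviates more than z_threshold standard
    deviations from the mean -- a simple proactive anomaly check."""
    amounts = [r["amount"] for r in rows]
    mu, sigma = mean(amounts), stdev(amounts)
    return [r for r in rows if sigma and abs(r["amount"] - mu) / sigma > z_threshold]

def load(rows, sink):
    """Load step: append validated rows to the target sink."""
    sink.extend(rows)
    return sink
```

In a production pipeline each step would be a separately monitored task, with the quality check gating the load.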

Required qualifications, capabilities, and skills

  • Formal training or certification on Software / Data Engineering concepts and 3+ years applied experience
  • Experience across the data lifecycle
  • Experience working with modern lakehouse platforms (e.g., Databricks, AWS Glue)
  • Proficient in SQL coding (e.g., joins and aggregations)
  • Experience building microservice-based components using ECS or EKS
  • Experience in building and optimizing data pipelines, architectures, and data sets (Glue or Databricks ETL)
  • Proficient in object-oriented and functional scripting languages (e.g., Python)
  • Experience in developing ETL processes and workflows for streaming data from heterogeneous data sources (e.g., Kafka)
  • Experience building pipelines on AWS using Terraform and CI/CD pipelines
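The SQL proficiency asked for above (joins and aggregations) can be demonstrated with a short, self-contained example using Python's built-in sqlite3 module; the tables, columns, and data are invented for illustration.

```python
import sqlite3

# In-memory database with illustrative tables; the schema and values are
# assumptions for this example, not taken from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'NE'), (2, 'SW');
    INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 75.0), (12, 2, 40.0);
""")

# Join orders to customers, then aggregate total spend per region.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
```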


Preferred qualifications, capabilities, and skills

  • Advanced knowledge of data stores such as Amazon Aurora (RDBMS) and OpenSearch
  • Experience with data pipeline and workflow management tools (Airflow, etc.)
  • Strong analytical and problem-solving skills, with attention to detail.
  • Ability to work independently and collaboratively in a team environment.
  • A proactive approach to learning and adapting to new technologies and methodologies.
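The workflow-management tools mentioned above (Airflow, etc.) center on running tasks in dependency order. A minimal sketch of that idea, using only the standard library; the task names and the dependency graph are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: extract feeds transform, which feeds both the quality
# check and the load step -- the kind of graph Airflow would schedule.
dag = {
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"transform", "quality_check"},
}

def run_order(dag):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and monitoring on top of this core ordering idea.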