
JPMorgan Lead Data Engineer - Payments Technology 
United States, Illinois, Chicago 
765885967


Job responsibilities

  • Designs and delivers trusted data collection, storage, access, and analytics data platform solutions in a secure, stable, and scalable way
  • Defines database back-up, recovery, and archiving strategy
  • Generates advanced data models for one or more teams using firmwide tooling, linear algebra, statistical and geometrical algorithms
  • Delivers data pipeline/architecture solutions that can be leveraged across multiple businesses/domains
  • Influences peer leaders and senior stakeholders across the business, product, and data technology teams
  • Provides recommendations and insight on data management and governance procedures and intricacies applicable to the acquisition, maintenance, validation, and utilization of data
  • Identifies opportunities for process improvements and operational efficiencies
  • Creates data models for complex applications and integrations while being accountable for ensuring design constraints are met by data engineering standards and software code development
  • Oversees the design, development, and implementation of data solutions using Databricks and AWS Glue
  • Ensures the scalability, reliability, and performance of data pipelines and infrastructure

Required qualifications, capabilities, and skills

  • Formal training or certification in data engineering concepts and 5+ years of applied experience
  • Ability to guide and coach teams on approaches for achieving goals aligned with strategic initiatives
  • Practical SQL and NoSQL experience
  • Advanced understanding of database back-up, recovery, and archiving strategy
  • Expert knowledge of AI/ML models
  • Experience presenting and delivering data visualizations
  • Proficient in automation and continuous delivery methods including all aspects of the Software Development Life Cycle
  • Expert knowledge of a combination of PySpark, Databricks, AI/ML, Snowflake, Redshift, data lakes, data products, cloud-based big data technologies, and metadata handling
  • Advanced understanding of agile methodologies, including CI/CD, application resiliency, and security

Preferred qualifications, capabilities, and skills

  • Applicable certifications