JPMorgan Lead Software Engineer - Data Modeling / Governance
United States, Texas, Plano 
232741497

08.04.2025

Job responsibilities

  • Lead the design and implementation of data models and governance frameworks to ensure data quality, consistency, and security.
  • Develop data architecture solutions for machine learning engines and analytics, and oversee data migration to AWS, utilizing services like Athena.
  • Collaborate with cross-functional teams to define data requirements and develop architecture solutions that support business objectives.
  • Execute software solutions, design, and data analysis, ensuring secure, high-quality data models and production code.
  • Produce architecture and design artifacts for complex applications, ensuring design constraints are met.
  • Gather, analyze, and visualize data to drive continuous improvement of software applications and systems.
  • Develop and maintain APIs using Python for data integration and create interactive dashboards with Tableau.
  • Proactively identify hidden problems in data and improve coding hygiene and system architecture.
  • Contribute to software engineering communities and promote a culture of diversity, equity, and inclusion.

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 5+ years of applied experience
  • Minimum of 5 years of experience in data modeling, data architecture, and data governance, including big data, relational databases, and AWS or other cloud-based data architectures
  • Hands-on experience with ERwin or a similar data modeling tool
  • In-depth knowledge of data privacy regulations and compliance requirements
  • Experience with PCI, PII, and data classification
  • Strong understanding of relational databases, data warehousing, and data architecture for machine learning inputs and outputs
  • Familiarity with modern front-end technologies and cloud technologies
  • Minimum of 3 years of experience with Python, Spark, and AWS, including the full development lifecycle
  • Experience with Hadoop ecosystem technologies (HDFS, HBase, Hive, Pig, Spark, MapReduce, Cloudera)
  • Hands-on experience in system design, application development, testing, and operational stability
  • Solid understanding of agile methodologies (Scrum, Kanban), JIRA, CI/CD, application resiliency, Bitbucket, and security

Preferred qualifications, capabilities, and skills

  • Experience with Cassandra, Kafka, or other document databases and distributed event-streaming platforms preferred
  • Certification in AWS or related technologies preferred
  • Understanding of Eclipse/IntelliJ, Maven, Jenkins, Git, Control-M, or equivalent tools
  • Experience with and understanding of Java and object-oriented languages/methodologies preferred