JPMorgan Head Data - Principal Software Engineer
United Kingdom, England, London
Job ID: 769615822
01.09.2024

Job responsibilities

  • Architecture and implementation: Design and develop scalable, cost-effective, and secure distributed architectures and solutions, utilizing appropriate GCP and AWS services and technologies.
  • Lead, manage & mentor a small team of data engineers to design, develop, and implement data platforms, pipelines, and infrastructure for our multi-cloud product across GCP and AWS.
  • Develop data pipelines: Design, implement, and maintain data pipelines that efficiently collect, process, and store large volumes of data from various sources, ensuring data quality and integrity (a minimal sketch follows this list).
  • Continuously monitor, analyze, and optimize data pipelines to improve performance, reduce costs, and ensure reliability and scalability.
  • Ensure that data solutions comply with relevant data security and privacy regulations, and implement best practices for securing data at rest and in transit.
  • Establish & govern a project-wide data catalogue that maintains the list of all data sets across the platform, with appropriate descriptions of the data items.
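
As an illustration of the pipeline work described above, the following minimal Python sketch extracts records from a CSV source, applies a simple data-quality check, and loads the result into a local SQLite table. The file name, table name, and validation rule are hypothetical placeholders; in this role the sources and targets would be the cloud stores named elsewhere in this posting.

    import csv
    import sqlite3

    def extract(path):
        # Read raw records from a CSV source file.
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def transform(rows):
        # Basic data-quality rule: drop rows missing an id or amount.
        for row in rows:
            if row.get("id") and row.get("amount"):
                yield (row["id"], float(row["amount"]))

    def load(records, db_path="pipeline.db"):
        # Store validated records in SQLite (a stand-in for a warehouse such as BigQuery or Redshift).
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS transactions (id TEXT, amount REAL)")
        conn.executemany("INSERT INTO transactions VALUES (?, ?)", records)
        conn.commit()
        conn.close()

    if __name__ == "__main__":
        load(transform(extract("transactions.csv")))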

Required qualifications, capabilities, and skills

  • Formal training or certification in data engineering concepts and expert-level applied experience
  • Actively writing code, with proficiency in one or more languages commonly used in data engineering, such as Python, R, Java, or Scala. Strong problem-solving and critical-thinking skills, with the ability to break down complex problems and develop innovative solutions.
  • Data engineering skills: Proficiency in designing, building, and optimizing data pipelines, as well as experience with big data processing tools such as Apache Spark, Hadoop, and Dataflow.
  • Experience in designing & operating Operational Data Store/Data Lake/Data Warehouse platforms at scale with high availability.
  • Data integration: Familiarity with data integration tools and techniques, including ETL (Extract, Transform, Load) processes, real-time data streaming (e.g., using Apache Kafka, Kinesis, or Pub/Sub; see the streaming sketch after this list), and exposing data sets via GraphQL.
  • Cloud platform expertise: Deep understanding of GCP/AWS services, architectures, and best practices, with hands-on experience in designing and implementing scalable and cost-effective solutions.
  • Data storage and databases: Knowledge of various data storage options (e.g., relational databases, NoSQL, data lakes) and hands-on experience managing and optimizing databases such as PostgreSQL, MySQL, BigQuery, and Redshift.
  • Leadership and communication: Ability to lead a team of engineers, collaborate effectively with cross-functional teams, and communicate complex technical concepts to both technical and non-technical stakeholders.
  • A desire to teach others and share knowledge. We aren’t looking for hero developers, but for team players. We want you to coach other team members on coding practices, design principles, and implementation patterns.
  • Comfortable in uncharted waters. We are building something new. Things change quickly. We need you to learn new technologies and patterns quickly.
  • Ability to see the long term. We don’t want you to sacrifice the future for the present. We want you to choose technologies and approaches based on the end goals.
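
As a small illustration of the real-time streaming familiarity listed above, the sketch below consumes JSON events from a Kafka topic using the kafka-python client. The topic name, broker address, and event fields are assumptions, not details of this role; a Kinesis or Pub/Sub consumer would follow the same read-validate-forward pattern.

    import json

    from kafka import KafkaConsumer  # kafka-python client; one of several options

    # Topic name and broker address are illustrative placeholders.
    consumer = KafkaConsumer(
        "trade-events",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        # In a real pipeline the record would be validated, enriched, and written
        # to a downstream store such as BigQuery or Redshift.
        print(event.get("id"), event.get("amount"))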