Expoint – all jobs in one place

JPMorgan Software Engineer II – PySpark Developer/Data
India, Telangana, Hyderabad
Job ID: 795429818

Posted: 29.05.2025

Job responsibilities

  • Executes standard software solutions, design, development, and technical troubleshooting
  • Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
  • Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
  • Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
  • Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
  • Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
  • Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
  • Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 2+ years applied experience
  • Hands-on practical experience in system design, application development, testing, and operational stability
  • Proven experience (at least 3 years) as a Data Engineer, with a focus on PySpark and big data technologies
  • Proficiency in Python and PySpark for data processing and analysis
  • Experience with big data technologies in the cloud, such as AWS or other cloud services
  • Strong understanding of Databricks and SQL, and experience with relational databases
  • Knowledge of data warehousing concepts and ETL processes.
  • Experience monitoring data pipelines for performance and reliability, and troubleshooting issues as they arise
  • Experience optimizing PySpark jobs for performance and scalability, including tuning Spark configurations and resource allocation
  • Experience across the whole Software Development Life Cycle

Preferred qualifications, capabilities, and skills

  • Familiarity with modern front-end technologies
  • Exposure to cloud technologies
  • PySpark, SQL, AWS Cloud, AWS Glue