
Intercontinental Exchange - ICE Developer 
India, Maharashtra, Pune 
170868806

14.04.2025
Job Description


Responsibilities

  • Develop high-quality data processing infrastructure and scalable services capable of ingesting and transforming data at huge scale, on schedule, from many different sources.
  • Turn ideas and concepts into carefully designed, well-authored, high-quality code.
  • Articulate the interdependencies and the impact of design choices.
  • Develop APIs that power data-driven products, as well as external APIs consumed by internal and external customers of the data platform.
  • Collaborate with QA, product management, engineering, and UX to achieve well-groomed, predictable results.
  • Improve existing engineering processes and tools and develop new ones.

Knowledge and Experience

  • 3+ years of experience building enterprise software products.
  • Experience in object-oriented design and development with languages such as Java, J2EE, and related frameworks.
  • Experience building REST-based microservices in a distributed architecture, along with cloud technologies (AWS preferred).
  • Knowledge of Java/J2EE frameworks such as Spring Boot, microservices, JPA, JDBC, and related frameworks is a must.
  • Experience building high-throughput real-time and batch data processing pipelines using Kafka in an AWS environment, with AWS services such as S3, Kinesis, Lambda, RDS, DynamoDB, or Redshift (should know the basics at least; see the illustrative sketch after this list).
  • Experience with a variety of data stores for unstructured and columnar data as well as traditional database systems, for example MySQL and Postgres.
  • Proven ability to deliver working solutions on time.
  • Strong analytical thinking to tackle challenging engineering problems.
  • Great energy and enthusiasm, a positive and collaborative working style, and clear communication and writing skills.
  • Experience working in a DevOps environment ("you build it, you run it").
  • Demonstrated ability to set priorities and work in a fast-paced, dynamic team environment within a start-up culture.
  • Experience with big data technologies and exposure to Hadoop, Spark, AWS Glue, AWS EMR, etc. (nice to have).
  • Experience handling large data sets using technologies like HDFS, S3, Avro, and Parquet (nice to have).
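
For illustration only, and not part of the posting itself: a minimal sketch of the kind of Kafka consumer loop behind the real-time pipelines mentioned above. The broker address, consumer group, and topic name are hypothetical placeholders; a production service would add offset management, error handling, and a downstream sink such as S3 or a database.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TradeEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "trade-event-pipeline");    // hypothetical consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("trade-events")); // hypothetical topic name
            while (true) {
                // Poll a batch of records; in a real pipeline each record would be
                // transformed and written to a downstream store (e.g. S3 or a database).
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}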