Job Description:
Job Duties: Develop large-scale, distributed, fault-tolerant enterprise software applications using Hadoop and related Big Data technologies such as Spark, Spark Streaming, Hive, HBase, YARN, GCP, BigQuery, and BigTable. Build and implement REST APIs/services using the Spring framework that read data from the SOR, perform business logic, and expose the output data to clients via REST. Design and develop logical and physical data models for the Pricing Insights data platform. Build high-throughput, event-driven streaming data pipelines. Create and maintain Spring Boot batch applications. Translate software product requirements into functional and business logic. Schedule software jobs using UC4 and Apache Airflow. Participate in code reviews, design reviews, and internal testing sign-off for deliverables. Partial telecommuting permitted from within a commutable distance.
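For illustration, a minimal sketch of the kind of Spring Boot REST service these duties describe, assuming Java 8: it reads records from a stand-in for the system of record (SOR), applies placeholder business logic, and exposes the result to clients as JSON. The package, class, endpoint, and helper names are hypothetical, not taken from the employer's codebase.

```java
package com.example.pricing;

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical application class; all names are illustrative only.
@SpringBootApplication
@RestController
public class PricingInsightsApplication {

    public static void main(String[] args) {
        SpringApplication.run(PricingInsightsApplication.class, args);
    }

    // Read from the SOR stand-in, apply business logic, expose via REST.
    @GetMapping("/prices/{sku}")
    public List<Double> prices(@PathVariable String sku) {
        List<Double> raw = readFromSor(sku);       // stand-in SOR read
        return raw.stream()
                  .map(p -> p * 1.10)              // stand-in business logic: 10% markup
                  .collect(Collectors.toList());   // Java 8 friendly (no List.of)
    }

    private List<Double> readFromSor(String sku) {
        // Placeholder: a real service would query the actual system of record.
        return Arrays.asList(9.99, 19.99);
    }
}
```

A GET request to /prices/ABC123 would return the adjusted prices as JSON; in a real deployment the SOR read would be a database or BigQuery/BigTable query rather than a hard-coded list.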
Minimum Requirements: Master’s degree, or foreign equivalent, in Computer Science, Engineering, or a closely related field, plus two years of experience in the job offered or a related occupation. Employer will accept a Bachelor’s degree, or foreign equivalent, in Computer Science, Engineering, or a closely related field, plus five years of experience in the job offered or a related occupation.
Special Skill Requirements:
Two years of experience required in the following skills:
1. Java 8
2. Hadoop
3. Hive
4. Spark
5. Spring Boot
6. REST API
7. Distributed Systems
One year of experience required in the following skills:
8. GCP
9. BigQuery
10. BigTable
11. Dataproc
Must be legally authorized to work in the U.S. without sponsorship.