Job Duties: Develop and improve a high-quality risk infrastructure system around risk data, providing accurate data and decisions, and deliver customized software solutions to fit specific business needs. Review the current system’s architecture and identify areas of potential improvement. Collaborate with the technical team to design new modules or enhancements. Write and test code for the new modules or enhancements. Perform system testing and bug fixing. Oversee the deployment of enhanced systems into production. Build solutions to collect and analyze the vast amounts of eBay transactional and behavioral data to spot trends, identify potential risks, and make informed decisions. Design and develop systems to collect and analyze data from various sources. Identify patterns and insights that can lead to improved risk management. Integrate strategies into the risk management systems. Meet with stakeholders to understand software requirements. Analyze the requirements to design system blueprints. Share designs with the team and incorporate feedback. Finalize the system architecture and obtain approval from stakeholders. Document the design for future reference. Document all processes, methodologies, and systems used. Partial telecommuting permitted from within a commutable distance.
Minimum Requirements: Master’s degree, or foreign equivalent, in Computer Science, Engineering, or a closely related field plus three years of experience in the job offered or a related occupation.
Special Skill Requirements:
1. Experience designing systems that can process billions of requests in a low-latency fashion (3 years).
2. Experience using advanced Java features and threads, understanding garbage collection, and designing with Object-Oriented concepts and design patterns (3 years).
3. Experience with trade-offs between various databases and experience with relational and NoSQL databases (3 years).
4. Experience using Kafka producers and various types of Kafka consumers to aggregate data over large streams (3 years).
5. Experience running Hadoop MapReduce queries to perform analytics on the features created (3 years).
6. Experience running Spark queries in an optimized fashion to scale over large data sets using Spark utilities and functions (3 years).
7. Experience building UI tools and solutions rapidly using React (3 years).
8. Experience in JavaScript (3 years).
9. Experience implementing large distributed systems where functionality is built over microservices and event processors and scales to billions of requests (3 years).
10. Experience in architecture and implementation of rules engines (3 years).
11. Experience extracting and modeling metadata to improve configurability of the systems (3 years).
12. Experience writing shell scripts to automate small tasks, such as text parsing of logs (2 years).
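As a rough illustration of the kind of task item 12 describes, a minimal log-parsing script might look like the sketch below. The file path, log format, and sample entries are hypothetical, invented purely for the example; a real script would point at actual application logs.

```shell
#!/bin/sh
# Hypothetical example: tally status codes in a small access log.
# A tiny sample log is generated inline so the script is self-contained.
cat > /tmp/sample_access.log <<'EOF'
GET /home 200
GET /missing 404
POST /login 200
GET /admin 403
GET /home 200
EOF

# Extract the third field (the status code), then count and rank by frequency.
awk '{print $3}' /tmp/sample_access.log | sort | uniq -c | sort -rn
```

Piping `awk` into `sort | uniq -c | sort -rn` is a common one-liner pattern for frequency counts and is the sort of small automation the requirement refers to.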
Base salary: $184,662-234,850 per annum. 40 hours per week; M-F, 9:00 a.m. to 5:00 p.m.
Must be legally authorized to work in the U.S. without sponsorship.