You will have a track record of success in delivering new products, solving problems, and learning new technologies quickly. A commitment to teamwork, a proactive approach to problem solving, and strong verbal and written communication skills are essential. Creating reliable, scalable, and high-performance products requires technical expertise, a solid grasp of computer science fundamentals, and practical experience building efficient large-scale systems. You will be comfortable delivering quality solutions in a fast-growing environment where priorities can change rapidly.
Key job responsibilities
* Design and implement high-throughput, cost-effective data pipelines to extract, transform, and load (ETL) data and facts from structured and semi-structured knowledge sources.
* Develop and optimize state-of-the-art streaming algorithms to process large datasets in real-time, including tasks such as deduplication, topic clustering, and entity resolution.
* Build, extend, and maintain an existing codebase while also designing and developing new software components.
* Serve as technical lead for all stages of the software development cycle, including designing and developing new system architecture and improvements.
* Participate in prioritization, estimation, and sprint planning.
* Work in an Agile/Scrum environment to deliver high-quality software against aggressive schedules.
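One of the responsibilities above is streaming deduplication over large datasets. As a minimal sketch (not the team's actual implementation), exact deduplication can be done by hashing each record's content and keeping a set of seen digests; a production pipeline would typically bound memory with a probabilistic structure such as a Bloom filter instead:

```python
import hashlib

def dedup_stream(records, seen=None):
    """Yield each record the first time its content hash appears.

    Illustrative only: an in-memory set of digests grows without bound,
    so real streaming systems usually swap it for a Bloom filter or a
    windowed state store.
    """
    if seen is None:
        seen = set()
    for record in records:
        digest = hashlib.sha256(record.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            yield record

# Duplicates are dropped; first-arrival order is preserved.
print(list(dedup_stream(["a", "b", "a", "c", "b"])))  # prints ['a', 'b', 'c']
```

The same idea maps onto frameworks named later in this posting, e.g. Spark Structured Streaming's built-in drop-duplicates support with watermarking for state expiry.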
Basic qualifications
- Bachelor's degree in computer science or equivalent
- 5+ years of non-internship professional software development experience
- 5+ years of programming experience in at least one software programming language
- 5+ years of experience leading the design or architecture (design patterns, reliability, and scaling) of new and existing systems
- Experience as a mentor, tech lead or leading an engineering team
- 5+ years of experience with the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
Preferred qualifications
- Expertise in data processing frameworks such as Apache Spark, Kafka, or similar
- Experience with knowledge graph technologies, semantic web, or graph databases (e.g., Neo4j, RDF, SPARQL)
- Familiarity with machine learning and natural language processing (NLP) techniques
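The responsibilities mention extracting facts from semi-structured sources into a knowledge graph. As a hedged illustration (the record layout below is made up, not a real schema from this role), such an ETL step often flattens a JSON record into (subject, predicate, object) triples of the kind a graph store like Neo4j or an RDF loader ingests:

```python
import json

def extract_triples(raw_json):
    """Flatten a semi-structured record into (subject, predicate, object)
    facts. The "id"/"facts" layout is a hypothetical example schema.
    """
    record = json.loads(raw_json)
    subject = record["id"]
    triples = []
    for predicate, value in record.get("facts", {}).items():
        # Scalars produce one triple; lists fan out into one per element.
        values = value if isinstance(value, list) else [value]
        for obj in values:
            triples.append((subject, predicate, obj))
    return triples

raw = '{"id": "Q42", "facts": {"name": "Douglas Adams", "occupation": ["writer", "humorist"]}}'
for triple in extract_triples(raw):
    print(triple)
```

Downstream, triples in this shape can be bulk-loaded into a graph database and queried with a graph query language such as SPARQL or Cypher.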