Key job responsibilities
- Build end-to-end data pipelines to ingest and transform data from a variety of sources and systems, from traditional ETL pipelines to event data streams
- Utilize data from disparate data sources to build meaningful datasets for analytics and reporting
- Evaluate and implement various big-data technologies and solutions (e.g. Redshift, Hive/EMR, Spark, SNS, SQS, Kinesis) to optimize processing of extremely large datasets
- Write high-performing, optimized SQL queries
- Design and implement automated data processing solutions and data quality controls
Qualifications
- Bachelor's degree
- Experience as a Data Engineer or in a similar role
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience delivering end-to-end projects independently
- Knowledge of professional software engineering best practices across the full software development life cycle, including coding standards, software architecture, code reviews, source control management, continuous deployment, testing, and operational excellence
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS