Key responsibilities include:
- Designing, building, and maintaining scalable, automated, and fault-tolerant data pipelines using Spark, EMR, Redshift, Glue, S3, and Python.
- Simplifying complex datasets and enhancing data accessibility through innovative BI solutions.
Basic qualifications:
- Bachelor's degree in a quantitative field such as Computer Science, Computer Engineering, or Mathematics, or equivalent experience
- 3+ years of data engineering experience
- 3+ years of writing advanced SQL queries
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, R, or NodeJS
- Strong written and verbal communication skills, including comfort communicating and presenting to senior leadership