Key job responsibilities
- Architect, develop, and maintain scalable data infrastructure and data pipelines.
- Extract, transform, and load data from a variety of sources using SQL, scripting, and AWS big data technologies.
- Design and implement ETL pipelines and BI solutions.
- Stay up to date with advancements in big data technologies and contribute to pilot projects.
Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage