Key job responsibilities
- Work with engineering and business stakeholders to understand data requirements
- Lead the design, modeling, and implementation of large, evolving, structured and unstructured datasets
- Evaluate and implement efficient distributed storage and query techniques
- Implement robust and maintainable code with clear and maintained documentation
- Implement test automation on code through unit testing and integration testing

A day in the life
Stay on top of the latest trends in data warehousing and be able to coordinate and work on multiple, related projects.
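The test-automation responsibility above can be sketched with a minimal, hypothetical example using Python's standard-library unittest; the `normalize_record` function and its cleaning rules are invented for illustration:

```python
import unittest

def normalize_record(record):
    """Hypothetical transform step: trim whitespace and lowercase emails."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
    }

class NormalizeRecordTest(unittest.TestCase):
    def test_strips_whitespace_and_lowercases_email(self):
        raw = {"name": "  Ada Lovelace ", "email": " ADA@Example.COM "}
        self.assertEqual(
            normalize_record(raw),
            {"name": "Ada Lovelace", "email": "ada@example.com"},
        )

# Run the suite programmatically so the result can be inspected
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeRecordTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a pipeline, unit tests like this cover individual transforms, while integration tests exercise the end-to-end flow against a test database.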
- Multiple years of industry experience
- Proficiency in writing complex SQL with PostgreSQL, Redshift, or another relational database
- Coding proficiency in at least one programming language: Python, Scala, Java
- Experience in data modelling, data warehousing and building ETL/ELT pipelines
- Experience with AWS, including Redshift, S3, RDS, Athena, Elastic MapReduce
- Experience with data modelling, data warehouse technical architectures, and reporting/analytics tools
- Degree in Computer Science, Engineering, Mathematics, or a related field
- Data warehousing experience with Redshift or Teradata
- Experience with workflow management platforms for data engineering pipelines (e.g., Apache Airflow)
- Experience with Big Data Technologies (Spark, Hadoop, Hive, Pig, etc.)
- Experience building and operating highly available, distributed systems for the extraction, ingestion, and processing of large datasets
- Experience providing technical leadership and mentoring other engineers on best practices in data engineering
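The ETL/ELT work the qualifications describe can be illustrated with a minimal, self-contained sketch using Python's standard-library sqlite3; all table names, column names, and sample values are invented for illustration only:

```python
import sqlite3

# Hypothetical ETL pipeline: extract raw rows, transform them in Python,
# load them into a clean table, then aggregate with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "us"), (2, 300, "US"), (3, 999, "de")],
)

# Extract
rows = conn.execute("SELECT id, amount_cents, country FROM raw_orders").fetchall()

# Transform: convert cents to dollars and normalize country codes
transformed = [(i, cents / 100.0, country.upper()) for i, cents, country in rows]

# Load
conn.execute("CREATE TABLE orders_clean (id INTEGER, amount_usd REAL, country TEXT)")
conn.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", transformed)

# Query the cleaned data: revenue per country
totals = dict(
    conn.execute(
        "SELECT country, SUM(amount_usd) FROM orders_clean GROUP BY country"
    ).fetchall()
)
```

In production this pattern would run against a warehouse such as Redshift rather than SQLite, typically orchestrated by a scheduler like Apache Airflow, but the extract/transform/load structure is the same.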