Major Responsibilities:
- Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data pipelines on distributed data processing platforms
- Recognize and adopt best practices in data processing, reporting, and analysis: data integrity, test design, analysis, validation, and documentation
- Keep up to date with big data technologies; evaluate and decide on the use of new or existing software products when designing the data architecture
- Design, build, and own all the components of a high-volume data warehouse end to end
- Provide end-to-end data engineering support across the project lifecycle (design, execution, and risk assessment)
- Own the functional and non-functional scaling of software systems in your ownership area
- Implement big data solutions for distributed computing

Key job responsibilities
As a DE on our team, you will be responsible for leading the data modeling, database design, and launch of some of the core data pipelines. You will have significant influence on our overall strategy by helping define the data model, driving the database design, and spearheading best practices to deliver high-quality products.
We push the envelope in using cloud services on AWS as well as the latest in distributed systems, forecasting algorithms, and data mining.
- 3+ years of data engineering experience
- 4+ years of SQL experience
- Experience with data modeling, warehousing and building ETL pipelines