The core competencies required for this role include:
Bachelor’s degree in Computer Science or Engineering
3+ years of hands-on experience in data engineering
In-depth knowledge of the big data tech stack
Expertise in PySpark and SQL
Expertise in Databricks, Snowflake, and Airflow
Excellent written and verbal communication skills
On a day-to-day basis, you'll focus on:
Building, enhancing, and troubleshooting complex data pipelines
Collaborating with product managers, engineers, and analysts on those pipelines
Collaborating with senior, lead, and principal engineers to define and implement quality standards across data pipelines
Contributing to the design and architecture of data pipelines
Implementing data quality and reliability measures across data pipelines