Key job responsibilities
- Collaborate with finance and business stakeholders to understand requirements and translate them into technical specifications.
- Design and develop scalable big data pipelines to ingest, transform, and publish large volumes of data efficiently.
- Troubleshoot system and data quality issues.
- Identify bottlenecks in the current architecture and propose efficient, long-term solutions.
- Contribute to system documentation and on-call runbooks.
- Mentor junior engineers and drive the successful implementation of projects.
A day in the life
- 3+ years of data engineering experience
- Experience with big data technologies such as Hadoop, Hive, Spark, or EMR
- Experience building and operating highly available, distributed systems for the extraction, ingestion, and processing of large data sets
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js
- Experience with SQL
- Bachelor's degree