Key job responsibilities
- Develop data products, infrastructure, and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, and Lambda) and internal BDT tools (DataNet, Cradle, QuickSight, etc.)
- Improve existing solutions and design the next generation of data architecture to improve scale, quality, timeliness, coverage, monitoring, and security.
- Stay up to date with the latest advancements in AI/LLM technology and its applications in data engineering to identify opportunities for automation and process improvement.

Seattle, WA, USA

Basic qualifications
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js
- Experience mentoring team members on best practices
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience operating large data warehouses