This role requires an individual with excellent problem-solving skills, an understanding of how Machine Learning models work, expertise in working with streaming data, deep knowledge of data pipeline engineering using next-gen AWS data technologies, and an understanding of business intelligence solutions.
You will be responsible for designing and implementing deployment dependencies for Machine Learning models, including feature engineering, model hosting, and integration mechanisms. You will also design and implement complex ETL pipelines in a data warehouse platform and other BI solutions to support the rapidly growing and dynamic business demand for data, delivering data as a service that has an immediate influence on day-to-day decision making at Amazon.

Key job responsibilities
- You help build the infrastructure to answer questions with data, using software engineering best practices, data management fundamentals, data storage principles, and recent advances in distributed systems
- You help drive the architecture and technology choices that enable a world-class user experience
- You manage AWS resources
- You uphold Operational Excellence best practices and actively contribute to evolving established standards
- You encourage the organization to adopt next-generation data architecture strategies, proposing both data flows and storage solutions
- You are comfortable presenting your findings to large groups

A day in the life
1. Medical, Dental, and Vision Coverage
2. Maternity and Parental Leave Options
3. Paid Time Off (PTO)
4. 401(k) Plan

Seattle, WA, USA
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience as a Data Engineer or in a similar role
- Experience programming with at least one modern language such as C++, C#, Java, Python, Golang, PowerShell, Ruby
- Bachelor's Degree
- Experience building/operating highly available, distributed systems of data extraction, ingestion, and processing of large data sets
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Knowledge of professional software engineering best practices across the full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployment, testing, and operational excellence