Sales, Marketing and Global Services (SMGS)

Key job responsibilities
- Excellent business and interpersonal skills; work with business owners to understand data requirements and build ETL to ingest the data into the data lake (see the sketch below).
- Be an authority at crafting, implementing, and operating stable, scalable, low-cost solutions to flow data from production systems into the data lake.
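
For illustration only, here is a minimal sketch of the kind of batch ETL job this role builds, assuming hypothetical S3 paths and a simple "orders" schema; the real sources, schema, and partitioning would come from the business owners of the data.

    # Minimal batch ETL sketch (PySpark); bucket names and fields are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

    # Extract: read raw JSON dropped by an upstream production system.
    raw = spark.read.json("s3://example-raw-bucket/orders/2024/05/")

    # Transform: keep well-formed rows and derive a partition column.
    clean = (
        raw.where(F.col("order_id").isNotNull())
           .withColumn("order_date", F.to_date("order_timestamp"))
    )

    # Load: append partitioned Parquet into the data lake for downstream analytics.
    (clean.write
          .mode("append")
          .partitionBy("order_date")
          .parquet("s3://example-data-lake/curated/orders/"))

Partitioning by a date column keeps downstream scans cheap and makes incremental loads easy to manage.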
About the team
Diverse Experiences
AWS values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying.

Mentorship & Career Growth
We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance
Qualifications
- This position requires a Bachelor's Degree in Computer Science or a related technical field, and 3+ years of related employment experience.
- 3+ years of work experience with ETL, Data Modeling, and Data Architecture.
- Expert-level skills in writing and optimizing SQL.
- Experience with Big Data technologies such as Hadoop, Hive, Spark.
- Proficiency in at least one scripting language, such as Python, Ruby, or shell scripting.
- Experience working in very large data warehouses or data lakes.
- Sound knowledge and hands-on experience in ETL optimization, designing, coding, and tuning big data processes using Apache Spark or similar technologies.
- Experience with building data pipelines and applications to stream and process datasets at low latencies.
- Demonstrated efficiency in data tracking and lineage, ensuring data quality, and improving the discoverability of data.
- Sound knowledge of distributed systems and data architectures (e.g., the Lambda architecture): design and implement batch and stream data processing pipelines, and optimize data distribution and partitioning in MPP architectures (see the sketch after this list).
- Knowledge of Engineering and Operational Excellence using standard methodologies.
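
For illustration only, a minimal sketch of the streaming half of a lambda-style pipeline referenced above, assuming a hypothetical Kafka topic, broker address, event schema, and S3 locations:

    # Minimal streaming ingest sketch (PySpark Structured Streaming); all names are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("events-stream").getOrCreate()

    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("event_time", TimestampType()),
        StructField("payload", StringType()),
    ])

    # Read events from Kafka and parse the JSON value column.
    events = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker-1:9092")
             .option("subscribe", "prod-events")
             .load()
             .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
             .select("e.*")
    )

    # Append low-latency micro-batches to the data lake, partitioned by date.
    query = (
        events.withColumn("event_date", F.to_date("event_time"))
              .writeStream
              .format("parquet")
              .option("path", "s3://example-data-lake/streaming/events/")
              .option("checkpointLocation", "s3://example-data-lake/checkpoints/events/")
              .partitionBy("event_date")
              .trigger(processingTime="1 minute")
              .start()
    )
    query.awaitTermination()

The checkpoint location is what lets the job resume from where it left off after a restart.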