In this critical role, you will collaborate closely with Industry Specialists, Software Engineers, Security Engineers, and Product Managers to understand data requirements. You will architect data models, build ETL/ELT pipelines, and implement data distribution layers that enable advanced security use cases such as threat detection, incident response, forensic analysis, and data exploration. With your keen data engineering skills, you will empower the AWS SOC to derive insights from our security telemetry data and drive continuous improvement of Amazon's security posture.

Key job responsibilities
• Architect and build scalable data pipelines for ingesting, processing and serving security data from diverse sources across Amazon
• Design and implement data models, storage solutions and distributed processing frameworks for security analytics workloads
• Collaborate with data scientists to build features, transformations and serving layers for machine learning models
• Optimize data engineering systems for performance, cost, reliability, recoverability and security
• Automate data quality monitoring, testing and validation processes
• Partner with software engineering teams to seamlessly integrate data solutions with security applications and services

Diverse Experiences
Amazon Security values diverse experiences. Even if you do not meet all of the qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying.

Training & Career Growth
We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, training, and other career-advancing resources here to help you develop into a better-rounded professional.

Work/Life Balance
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or Node.js
- Knowledge of distributed systems as they pertain to data storage and computing
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience building production data pipelines and processing frameworks on AWS
- Experience operating large data warehouses
- Experience communicating with users, other technical teams, and management to collect requirements, describe data modeling decisions and data engineering strategy
- Master's degree in computer science, engineering, analytics, mathematics, statistics, IT or equivalent
- Experience with MPP databases such as Amazon Redshift
- Experience with SQL
- Experience in a security operations, cybersecurity or incident response environment
- Familiarity with stream processing engines such as Spark Streaming or Flink
- Knowledge of machine learning/deep learning frameworks and feature engineering
- Experience with DataOps practices and data quality frameworks
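To give a flavor of the data quality monitoring and validation work this role involves, here is a minimal Python sketch of per-record checks rolled up into pipeline-level metrics. All field names, schemas, and thresholds are hypothetical illustrations, not Amazon's actual telemetry format:

```python
from datetime import datetime, timezone

# Hypothetical required fields for a security telemetry record.
REQUIRED_FIELDS = {"event_id", "source", "timestamp", "event_type"}

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record (empty = clean)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    ts = record.get("timestamp")
    if ts is not None:
        try:
            parsed = datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            errors.append(f"unparseable timestamp: {ts!r}")
        else:
            # Treat naive timestamps as UTC so the comparison is well-defined.
            if parsed.tzinfo is None:
                parsed = parsed.replace(tzinfo=timezone.utc)
            if parsed > datetime.now(timezone.utc):
                errors.append("timestamp is in the future")
    return errors

def quality_report(records: list[dict]) -> dict:
    """Aggregate per-record checks into simple pipeline-level quality metrics."""
    failures = [errs for r in records if (errs := validate_record(r))]
    return {
        "total": len(records),
        "failed": len(failures),
        "failure_rate": len(failures) / len(records) if records else 0.0,
    }
```

In a production pipeline these checks would typically run inside the ingestion framework (e.g. as a Spark stage) and feed alerting on the failure rate, rather than as standalone functions.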