*Role Impact*
In this role, you'll contribute to transforming how businesses leverage data through AWS's distributed processing solutions. You'll collaborate with a diverse team to solve complex technical challenges and help customers optimize their big data implementations across multiple AWS services.

*About the Team*

*Key Responsibilities*
- Develop and share technical solutions through various communication channels
- Participate in mentoring and knowledge sharing initiatives
*A Day in the Life*
- Create technical documentation and educational content
- Learn and implement new technologies
*Benefits of joining this role*
- Flexible hybrid work arrangement
- Comprehensive professional development
- Work-life harmony
- Career growth opportunities

Schedule Note: This role follows a flexible 5-day work week schedule, which may include weekends on rotation.
*Qualifications*
- Good depth of understanding of Hadoop administration, support, and troubleshooting (any two of: Apache Spark, Apache Hive, Presto, MapReduce, ZooKeeper, HBase, HDFS, and Pig)
- Good understanding of Linux and Networking concepts
- Intermediate programming/scripting skills, ideally in Java or Python; experience in other object-oriented and functional languages will also be considered
- Bachelor’s degree in Information Science / Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field
- Good understanding of distributed computing environments
- Prior working experience with AWS (any or all of EC2, S3, EBS, EMR)