In addition to customer engagements, you will work with our internal team to provide requirements for our Snowconvert utility based on project experience, ensuring that our tooling continuously improves as we learn from implementations. This role will report to the Director of Data Engineering and Snowpark within the Workload Solutions team in the PS&T organization at Snowflake.
Delivery
Be well-versed in migrating applications, code, and data to cloud platforms, and lead the design of the resulting services on Snowflake
Be able to outline the architecture of Spark and Scala environments
Guide customers on architecting and building data engineering pipelines on Snowflake
Run workshops and design sessions with stakeholders and customers
Troubleshoot migration issues
Create repeatable processes and documentation as a result of customer engagements
Write Python and shell scripts for ETL workflows
Develop best practices, including knowledge transfer, so that customers are properly enabled and can extend the capabilities of Snowflake on their own
Provide guidance on how to resolve customer-specific technical challenges
Outline a testing strategy and plan
Optimize Snowflake for performance and cost
Product Strategy
Communicate requirements for Snowpark conversion capabilities for Scala- and Spark-based back-end software modules
Communicate requirements for the design and development of back-end big data frameworks that enhance our Snowpark platform
Weigh in on and develop frameworks for distributed computing, Apache Spark, PySpark, Python, HBase, Kafka, REST-based APIs, and machine learning as part of our tool development (Snowconvert) and overall modernization processes
Bachelor's degree in a technical discipline or equivalent practical experience
8+ years of experience in a customer-facing technical role on complex technical implementation projects, with a proven track record of delivering results in multi-party, multi-year digital transformation engagements
Experience in Data Warehousing, Business Intelligence, AI/ML, application modernization, or Cloud projects, including building real-time and batch data pipelines using Spark and Scala
Ability to deliver outcomes for customers in a new arena and with a new product set
Ability to learn new technology and build repeatable solutions/processes
Ability to anticipate project roadblocks and have mitigation plans in-hand
Proven ability to communicate and translate effectively across multiple groups from design and engineering to client executives and technical leaders
Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or with prepared slides
Ability and flexibility to travel to work with customers on-site as needed
The following represents the expected range of compensation for this role: