
Snowflake SOLUTIONS ARCHITECT – AI/ML 
Singapore, Singapore 
334018936

19.11.2024
AS A SOLUTIONS ARCHITECT - AI/ML AT SNOWFLAKE, YOU WILL:
  • Be a technical expert on all aspects of Snowflake in relation to the AI/ML workload

  • Build and deploy ML pipelines using Snowflake features and/or Snowflake ecosystem partner tools based on customer requirements (see the sketch after this list)

  • Work hands-on where needed using SQL, Python, Java and/or Scala to build POCs that demonstrate implementation techniques and best practices on Snowflake technology within the Data Science workload

  • Follow best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own

  • Maintain deep understanding of competitive and complementary technologies and vendors within the AI/ML space, and how to position Snowflake in relation to them

  • Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments

  • Provide guidance on how to resolve customer-specific technical challenges

  • Support other members of the Professional Services team in developing their expertise

  • Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing
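
The bullet on building POC-style ML pipelines above refers to the kind of hands-on work sketched below. This is a minimal illustration only, not Snowflake's prescribed approach: the connection parameters, the TRANSACTIONS table and its columns are hypothetical placeholders, and the sketch simply pushes a filter down via Snowpark, pulls the result into pandas and fits a toy scikit-learn model.

    # Minimal POC sketch: read a Snowflake table via Snowpark, train a model locally.
    # Connection parameters, table and column names are hypothetical placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col
    from sklearn.linear_model import LogisticRegression

    connection_parameters = {
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }

    # Open a Snowpark session against the target account.
    session = Session.builder.configs(connection_parameters).create()

    # Push simple filtering down to Snowflake, then pull the result into pandas.
    features = (
        session.table("TRANSACTIONS")
        .filter(col("AMOUNT") > 0)
        .select("AMOUNT", "CUSTOMER_AGE", "IS_FRAUD")
        .to_pandas()
    )

    # Fit a toy classifier on the extracted features.
    model = LogisticRegression(max_iter=1000)
    model.fit(features[["AMOUNT", "CUSTOMER_AGE"]], features["IS_FRAUD"])
    print(model.score(features[["AMOUNT", "CUSTOMER_AGE"]], features["IS_FRAUD"]))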

OUR IDEAL SOLUTIONS ARCHITECT - AI/ML WILL HAVE:
  • A minimum of 10 years' experience working with customers in a pre-sales or post-sales technical role

  • Skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos

  • Thorough understanding of the complete Data Science lifecycle, including feature engineering, model development, model deployment and model management

  • Strong understanding of MLOps, coupled with technologies and methodologies for deploying and monitoring models

  • Experience and understanding of at least one public cloud platform (AWS, Azure or GCP)

  • Experience with at least one Data Science tool, such as AWS SageMaker, Azure ML, Dataiku, DataRobot, H2O or Jupyter notebooks

  • Hands-on scripting experience with SQL and at least one of the following: Python, Java or Scala

  • Experience with libraries such as Pandas, PyTorch, TensorFlow, scikit-learn or similar (see the sketch after this list)

  • University degree in computer science, engineering, mathematics or related fields, or equivalent experience
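
As a concrete touchstone for the lifecycle, MLOps and library bullets above, the following is a self-contained scikit-learn sketch on synthetic data; the feature values, model choice and file name are illustrative assumptions, not requirements of the role.

    # Hedged end-to-end sketch: synthetic data, a preprocessing + model pipeline,
    # a holdout evaluation, and serialization of the fitted artifact for deployment.
    import joblib
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 4))                 # synthetic features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic label

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # Feature engineering and model development captured in one pipeline object.
    pipeline = Pipeline([
        ("scale", StandardScaler()),
        ("model", GradientBoostingClassifier()),
    ])
    pipeline.fit(X_train, y_train)
    print("holdout accuracy:", pipeline.score(X_test, y_test))

    # Model management: persist the fitted pipeline for a downstream deployment step.
    joblib.dump(pipeline, "model.joblib")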

BONUS POINTS FOR HAVING:
  • Experience with Databricks/Apache Spark

  • Experience implementing data pipelines using ETL tools (see the sketch after this list)

  • Experience working in a Data Science role

  • A proven track record of success in enterprise software

  • Deep expertise in a core vertical such as FSI, Retail, Manufacturing, etc.
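
To make the Spark and ETL bullets concrete, here is a brief PySpark sketch of a toy extract-transform-load step; the file paths and column names are hypothetical placeholders, and real pipelines would add schema management, validation and orchestration.

    # Hedged ETL sketch in PySpark: read raw CSV, apply a simple transform, write Parquet.
    # File paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("toy-etl").getOrCreate()

    # Extract: load the raw data.
    raw = spark.read.csv("/data/raw/orders.csv", header=True, inferSchema=True)

    # Transform: keep valid rows and derive a simple feature.
    clean = (
        raw.filter(F.col("amount") > 0)
           .withColumn("amount_usd", F.col("amount") * F.col("fx_rate"))
    )

    # Load: write the curated table as Parquet for downstream consumers.
    clean.write.mode("overwrite").parquet("/data/curated/orders")

    spark.stop()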