Expoint - all jobs in one place

Snowflake: Solutions Consultant, Snowpark
Costa Rica
Job ID: 458403918

09.07.2024

AT SNOWFLAKE, YOU WILL:

  • Be responsible for delivering exceptional outcomes for our teams and customers during our modernization projects. You will engage with customers to migrate from legacy environments into Snowpark/Snowflake, acting as the expert for our customers and partners throughout this process.
  • In addition to customer engagements, you will work with our internal team to provide requirements for our Snowconvert utility based on project experience, ensuring that our tooling continuously improves as we learn from each implementation.

YOU WILL HAVE:

  • University degree in computer science, engineering, mathematics or related fields, or equivalent experience
  • Minimum 5 years of experience as a solutions architect, data architect, database administrator, or data engineer
  • Willingness to forge ahead to deliver outcomes for customers in a new arena, with a new product set
  • Passion for solving complex customer problems
  • Ability to learn new technology and build repeatable solutions/processes
  • Ability to anticipate project roadblocks and have mitigation plans in-hand
  • Experience in Data Warehousing, Business Intelligence, AI/ML, application modernization, or Cloud projects
  • Experience in building real-time and batch data pipelines using Spark and Scala
  • Proven track record of results with multi-party, multi-year digital transformation engagements
  • Proven ability to communicate and translate effectively across multiple groups from design and engineering to client executives and technical leaders
  • Strong organizational skills, ability to work independently and manage multiple projects simultaneously
  • Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations
  • Hands-on experience in a technical role (SQL, data warehousing, cloud data, analytics, or ML/AI)
  • Extensive knowledge of and experience with large-scale database technology (e.g. Snowflake, Netezza, Exadata, Teradata, Greenplum, etc.)
  • Software development experience with Python, Java, Spark, and other scripting languages
  • Proficiency in implementing data security measures, access controls, and design within the Snowflake platform
  • Internal and/or external consulting experience

Skillset and Delivery Activities:

  • Outline the architecture of Spark and Scala environments
  • Guide customers on architecting and building data engineering pipelines on Snowflake
  • Run workshops and design sessions with stakeholders and customers
  • Create repeatable processes and documentation as a result of customer engagement
  • Write Python and shell scripts for ETL workflows
  • Develop best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
  • Weigh in on and develop frameworks for Distributed Computing, Apache Spark, PySpark, Python, HBase, Kafka, REST-based APIs, and Machine Learning as part of our tools development (Snowconvert) and overall modernization processes
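The ETL scripting described above can be sketched as a minimal batch pipeline. This example is an illustration only: it uses the Python standard library (csv and sqlite3 standing in for a warehouse connection) rather than Snowflake's Snowpark API, and all file, table, and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Illustrative raw input; in a real engagement this would come from files
# in a stage or a legacy source system.
RAW_CSV = """order_id,amount,region
1, 19.99 ,us-east
2,5.00,EU-WEST
3, 42.50,us-east
"""

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Trim whitespace, cast amounts to float, and normalize region codes."""
    return [
        (int(r["order_id"]), float(r["amount"].strip()), r["region"].strip().lower())
        for r in rows
    ]

def load(rows, conn):
    """Insert transformed rows into a staging table (name is hypothetical)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders "
        "(order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT ROUND(SUM(amount), 2) FROM stg_orders").fetchone()[0]
print(total)  # prints 67.49
```

The same extract/transform/load shape carries over to Snowpark, where the transform step would run as DataFrame operations pushed down into Snowflake rather than in local Python.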