
EY - GDS Consulting, AI & Data, Cloud Data
India, Karnataka, Bengaluru 
Job ID: 19633462

Posted: 09.12.2025

Job Name: GDS Consulting Cloud Data Architect – Snowflake

The leader will be responsible for the following areas:

Strategic Leadership

  • Drive the strategic vision for cloud data modernization, leveraging Snowflake capabilities within EY.
  • Lead cross-functional teams across geographies to deliver scalable data solutions that support enterprise transformation.
  • Influence client C-suite stakeholders by articulating the business value of cloud-native data architectures.

Market Intelligence and Go-To-Market (GTM)

  • Monitor industry trends, competitor offerings, and emerging technologies in cloud data platforms to inform solution strategy.
  • Collaborate with EY GTM teams to develop differentiated offerings around Snowflake.
  • Contribute to EY’s market positioning by aligning architecture capabilities with client demand and sector-specific needs.

Sales and Business Development

  • Support pre-sales activities including client workshops, RFP responses, and solution demos.
  • Partner with account teams to identify opportunities for cloud data transformation and drive pipeline growth.
  • Translate client requirements into scalable architecture proposals that align with EY’s delivery capabilities.

Thought Leadership

  • Represent EY in industry forums, webinars, and conferences focused on cloud data and analytics.
  • Author whitepapers, blogs, and internal knowledge assets on Snowflake best practices.
  • Mentor junior architects and contribute to EY’s internal capability development programs.

Partnership and Ecosystem Development

  • Build and nurture strategic alliances with Snowflake and Azure to co-develop joint solutions.
  • Engage with technology partners to stay abreast of product roadmaps and integration capabilities.
  • Leverage EY’s ecosystem to deliver end-to-end solutions that span data, AI, cloud, and security.

Essential Functions of the Job

  • Lead and architect the migration of data analytics environments from legacy systems to Snowflake with a focus on performance, reliability, and scalability.
  • Define high-level migration roadmaps and future-state reference architectures on Azure cloud.
  • Design and implement end-to-end data analytics solutions leveraging Snowflake utilities (SnowSQL, Snowpipe), notebooks, and big data modeling techniques using Python.
  • Set up, configure, and deploy Snowflake solutions across multi-department environments.
  • Guide teams in implementing best practices for data transformation, aggregation, and workload management.
  • Address technical inquiries related to customization, integration, and enterprise architecture.
  • Enable business stakeholders to make data-driven decisions through accurate analysis, reporting, and presentation of key findings.
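As a brief aside, the Snowpipe utility named above can be sketched in a few lines of Snowflake SQL. This is an illustrative example, not part of the role description: the database, stage, and table names are hypothetical, and the SQL is held in Python strings purely so the sketch is self-contained.

```python
# Illustrative Snowflake SQL for continuous loading with Snowpipe.
# All object names (raw_db, events, events_stage) are hypothetical.

# A pipe wraps a COPY INTO statement; with AUTO_INGEST enabled,
# Snowflake loads new files as they land in the external stage.
pipe_sql = """
CREATE PIPE raw_db.public.events_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw_db.public.events
FROM @raw_db.public.events_stage
FILE_FORMAT = (TYPE = 'JSON')
"""
```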

Responsibilities

  • Make strategic decisions on cloud architecture design, resource allocation, and technology investments.
  • Evaluate performance metrics and ROI of Snowflake implementations.
  • Conduct impact analysis of proposed solutions on client operations, including cost-benefit, scalability, and integration feasibility.
  • Prioritize solution components based on business value and technical feasibility.


Education & Experience

  • Bachelor’s degree in Engineering, Computer Science, or related field.
  • 15-20 years of experience in data architecture, solution design, and implementation.
  • At least two end-to-end implementations of Snowflake and cloud data platforms.
  • Demonstrated success in leading large-scale data migration projects and managing cross-functional teams.
  • Experience in client-facing roles with a track record of delivering high-impact solutions.

Knowledge and Skills Requirements

  • Proven experience in designing and implementing cloud data warehouse solutions (Snowflake) on Azure.
  • Deep understanding of Snowflake architecture, including compute/storage separation, RBAC, performance tuning, and advanced features such as time travel and zero-copy cloning.
  • Expertise in big data modeling, ETL pipeline optimization, and Python-based data engineering.
  • Strong consultative skills with the ability to manage client relationships and drive consensus.
  • Experience in handling complex deployments and multi-tower configurations.
  • Familiarity with Presto/Starburst, shell scripting, and Unix/Windows environments is a plus.
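Two of the Snowflake features named above, time travel and zero-copy cloning, can be illustrated with short SQL statements. This is a minimal sketch only: the database and table names and the time-travel offset are hypothetical placeholders, and the SQL is held in Python strings so the example stands alone.

```python
# Illustrative Snowflake SQL; all object names are hypothetical.

# Time travel: query a table as it existed at an earlier point in time.
time_travel_sql = """
SELECT *
FROM analytics_db.sales.orders
AT (OFFSET => -3600)  -- table state as of one hour ago
"""

# Zero-copy clone: create an independent copy that shares underlying
# storage with the source until either side changes, so it is near-instant
# and initially consumes no extra storage.
clone_sql = """
CREATE TABLE analytics_db.sales.orders_dev
CLONE analytics_db.sales.orders
"""
```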