Expoint – all jobs in one place
ืžืฆื™ืืช ืžืฉืจืช ื”ื™ื™ื˜ืง ื‘ื—ื‘ืจื•ืช ื”ื˜ื•ื‘ื•ืช ื‘ื™ื•ืชืจ ืžืขื•ืœื ืœื ื”ื™ื™ืชื” ืงืœื” ื™ื•ืชืจ

Wanted: EY-GDS Consulting AI DATA Power BI Manager at EY in Coimbatore, India

ืžืฆืื• ืืช ื”ื”ืชืืžื” ื”ืžื•ืฉืœืžืช ืขื‘ื•ืจื›ื ืขื ืืงืกืคื•ื™ื ื˜! ื—ืคืฉื• ื”ื–ื“ืžื ื•ื™ื•ืช ืขื‘ื•ื“ื” ื‘ืชื•ืจ Ey-gds Consulting-ai Data-power Bi-manager ื‘-India, Coimbatore ื•ื”ืฆื˜ืจืคื• ืœืจืฉืช ื”ื—ื‘ืจื•ืช ื”ืžื•ื‘ื™ืœื•ืช ื‘ืชืขืฉื™ื™ืช ื”ื”ื™ื™ื˜ืง, ื›ืžื• Ey. ื”ื™ืจืฉืžื• ืขื›ืฉื™ื• ื•ืžืฆืื• ืืช ืขื‘ื•ื“ืช ื”ื—ืœื•ืžื•ืช ืฉืœืš ืขื ืืงืกืคื•ื™ื ื˜!
ื ืžืฆืื• 52 ืžืฉืจื•ืช
Yesterday
EY

EY EY-GDS Consulting-AI DATA-AWS-Senior India, Tamil Nadu, Coimbatore

Limitless High-tech career opportunities - Expoint
ืชื™ืื•ืจ:

Provides data solutions by using software to process, store, and serve data to others. Tests data quality and optimizes data availability. Ensures that data pipelines are scalable, repeatable, and secure. Builds a deep dive analytical skillset by working with higher level Data Engineers on a variety of internal and external data.

Intermediate proficiency in Glue, EMR, Redshift, Step Functions, foundational services such as IAM, CloudWatch, CloudFormation, Lambda, Secrets Manager, and SageMaker, plus PySpark and SQL.

Qualifications:

  • Minimum of three years (3 to 6 years) of experience in data engineering, data analytics, programming, database administration, or data management.
  • Graduate degree (Bachelor's degree, BCA, or Master's) or an equivalent combination of training and experience.

Communication and Team Collaboration

  • Strong verbal and written communication
  • Highly flexible
  • Accountability and end-to-end (E2E) ownership

Job Description:

  • Writes ETL (Extract / Transform / Load) processes, designs database systems, and develops tools for real-time and offline analytic processing.
  • Troubleshoots software and processes for data consistency and integrity. Integrates data from a variety of sources for business partners to generate insight and make decisions.
  • Translates business specifications into design specifications and code. Responsible for writing programs, ad hoc queries, and reports. Ensures that all code is well structured, includes sufficient documentation, and is easy to maintain and reuse.
  • Partners with internal clients to gain a basic understanding of business functions and informational needs. Gains working knowledge in tools, technologies, and applications/databases in specific business areas and company-wide systems.
  • Participates in all phases of solution development. Explains technical considerations at related meetings, including those with business clients.
  • Experience building the specified AWS cloud architecture and supporting services and technologies (e.g., Glue, EMR, Redshift, Step Functions, and foundational services such as IAM, CloudWatch, CloudFormation, Lambda, Secrets Manager, SageMaker).
  • Experience building Spark data processing applications (Python, PySpark).
  • Experience with Apache Airflow (Iceberg preferred).
  • Experience with SQL development and Tableau reporting (preferred).
  • Experience leveraging/building Splunk dashboards.
  • Experience with CI/CD pipeline tools such as Bamboo, Bitbucket, GitHub, and Terraform scripts/CFS.
  • Tests code thoroughly for accuracy of intended purpose, including regression testing. Reviews the end product with the client to ensure adequate understanding. Provides data analysis guidance as required.
  • Provides tool and data support to business users and fellow team members.
  • Good to have: experience with test automation and test-driven development practices.
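The ETL responsibilities in the description above (extract, transform, load, plus consistency checks) can be sketched in plain Python. This is an illustrative stand-in, not the Glue/Redshift stack the posting names: SQLite substitutes for the warehouse, and the `orders` table and its columns are invented for the example.

```python
import sqlite3

def run_etl(conn, raw_rows):
    """Minimal extract/transform/load pass with a consistency check."""
    # Load target: a table for cleaned order records (hypothetical schema).
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    # Transform: drop malformed rows and normalize amounts to float.
    clean = [(int(r["id"]), float(r["amount"])) for r in raw_rows
             if r.get("id") is not None and r.get("amount") is not None]
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", clean)
    conn.commit()
    # Integrity check: loaded row count must match the distinct keys in the batch.
    (count,) = conn.execute("SELECT COUNT(*) FROM orders").fetchone()
    assert count == len({r[0] for r in clean}), "row count mismatch"
    return count

raw = [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": None}, {"id": 3, "amount": "7"}]
conn = sqlite3.connect(":memory:")
loaded = run_etl(conn, raw)  # the row with a missing amount is dropped
```

The same extract/transform/load shape scales up to PySpark jobs; only the I/O and execution engine change.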



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

Yesterday
EY

EY EY - GDS Consulting AI DATA Snowflake Developer Staff India, Tamil Nadu, Coimbatore

ืชื™ืื•ืจ:

Roles and Responsibilities:

  • Snowflake Development:
    • Design and implement Snowflake-based data warehouses.
    • Develop and optimize SnowProcs, SnowSQL scripts, and advanced Snowflake features.
  • Data Pipeline Design:
    • Build and maintain ETL/ELT workflows for large-scale data ingestion and transformation.
  • Performance Optimization:
    • Tune queries, manage compute resources, and optimize storage for cost and speed.
  • Programming & Automation:
    • Use Python and SQL for data engineering tasks and automation.
  • Collaboration:
    • Work closely with data architects, analysts, and business teams to deliver high-quality data solutions.
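The data-pipeline bullet above centers on incremental ELT loads into a warehouse. Snowflake itself is not shown here, but the MERGE-style upsert pattern such loads typically rely on can be sketched with SQLite's `INSERT ... ON CONFLICT` as a stand-in; the `dim_customer` table and its rows are invented for illustration.

```python
import sqlite3

def upsert_batch(conn, batch):
    """Snowflake-style MERGE emulated with SQLite upserts: new keys are
    inserted, existing keys are updated in place (idempotent reloads)."""
    conn.execute("CREATE TABLE IF NOT EXISTS dim_customer (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany(
        "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        batch,
    )
    conn.commit()
    # Return current table state as {id: name} for inspection.
    return dict(conn.execute("SELECT id, name FROM dim_customer ORDER BY id"))

conn = sqlite3.connect(":memory:")
upsert_batch(conn, [(1, "Ada"), (2, "Grace")])
state = upsert_batch(conn, [(2, "Grace H."), (3, "Edsger")])  # update + insert
```

In Snowflake the equivalent would be a `MERGE INTO ... WHEN MATCHED / WHEN NOT MATCHED` statement, often wrapped in a stored procedure.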

Snowflake Expertise:

  • Hands-on experience with Snowflake DW, SnowSQL, and stored procedures.

Programming:

  • Proficiency in SQL and basic Python scripting.

Data Processing:

  • Familiarity with ETL/ELT concepts and frameworks like Airflow or similar.

Database Management:

  • Understanding of schema design and relational database principles.

Preferred Qualifications:

  • Certifications: Snowflake SnowPro Core.
  • Cloud platforms: exposure to AWS, Azure, or GCP.
  • Soft skills: strong analytical and communication skills.

Experience:

  • 3+ years of relevant experience.

What we look for

  • A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment

You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that's right for you

More jobs that may interest you

08.12.2025
EY

EY EY - GDS Consulting AI DATA PowerBI Consultant Senior India, Tamil Nadu, Coimbatore

ืชื™ืื•ืจ:

Roles and Responsibilities:

  • 5+ years of experience in Power BI development and data visualization.
  • Strong proficiency in DAX, Power Query, and data modelling.
  • Advanced SQL skills for data extraction and transformation.
  • Experience with ETL processes and integrating data from multiple sources.
  • Knowledge of cloud platforms (Azure/AWS) and data warehousing concepts.
  • Excellent communication and stakeholder management skills.

Nice to Have

  • Familiarity with Python or R for advanced analytics.
  • Knowledge of Power BI administration and Row-Level Security (RLS).
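Row-Level Security, named in the last bullet, restricts which rows each user can see. In Power BI it is defined with DAX filter rules on roles, but the underlying idea can be sketched in plain Python; the sales data and region-based rule here are invented for illustration.

```python
def apply_rls(rows, user_region):
    """Row-Level Security sketch: a user only sees rows whose region
    matches the role they were assigned (cf. a DAX rule like
    [region] = USERPRINCIPALNAME()-mapped region)."""
    return [r for r in rows if r["region"] == user_region]

sales = [
    {"region": "EMEA", "amount": 120},
    {"region": "APAC", "amount": 80},
    {"region": "EMEA", "amount": 45},
]
visible = apply_rls(sales, "EMEA")  # APAC row is filtered out
```

The point of RLS is that the filter is applied by the engine before any visual or export sees the data, not by each report author.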

Experience:

5+ years of relevant experience.

What we look for

You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that's right for you


08.12.2025
EY

EY EY - GDS Consulting AI DATA Gen Manager E India, Tamil Nadu, Coimbatore

ืชื™ืื•ืจ:

Experience: 10+ years

Cloud: Microsoft Azure

AI/ML, Generative AI, Agentic AI, RDBMS, Vector Databases

Responsibilities:

  • Lead and manage a team of AI/ML engineers and data scientists in developing innovative machine learning solutions.
  • Collaborate with clients to understand their requirements and deliver tailored AI/ML solutions.
  • Design, implement, and optimize machine learning models using Python and relevant libraries.
  • Oversee project timelines, resource allocation, and team performance to ensure successful project delivery.
  • Utilize Azure cloud services for deploying and scaling AI/ML applications.
  • Mentor team members and foster a culture of continuous learning and improvement.

Technical Knowhow:

  • 10+ years of experience in AI/ML, with a strong focus on machine learning, GenAI, and Agentic AI.
  • Proven experience in managing teams and client relationships.
  • Proficiency in Python, along with experience implementing frameworks such as LangChain, LangGraph, and AutoGen.
  • Knowledge of database technologies (SQL, NoSQL) and data management practices.
  • Familiarity with Azure cloud services and deployment strategies.


08.12.2025
EY

EY EY-GDS Consulting-AI DATA-Power BI-Manager India, Tamil Nadu, Coimbatore

ืชื™ืื•ืจ:

We're looking for experienced professionals with strong technical expertise in data engineering and analytics, proven delivery capability, and leadership skills.

Job Summary:

As a Manager in the Data and Analytics team, you will be responsible for leading end-to-end BI solution delivery, primarily leveraging Power BI and ThoughtSpot. You will drive strategic conversations with clients, design robust data models and visualizations, and oversee the implementation of scalable, secure, and high-performing analytics solutions. You will also be responsible for leading teams, mentoring junior consultants, and ensuring excellence in delivery through best practices, innovation, and continuous improvement.


Required Skills and Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field with 9–14 years of industry experience.
  • Minimum of 7–10 years of experience in data analytics, with a strong focus on Power BI, Tableau, and ThoughtSpot.
  • Deep expertise in Power BI, including data modeling, DAX, Power Query, and report/dashboard development.
  • Hands-on experience with ThoughtSpot, including building Liveboards, Worksheets, SpotIQ analysis, and integrating with cloud/on-prem data sources.
  • Strong SQL skills with the ability to write and optimize complex SQL queries.
  • Proficiency with relational and cloud-based databases such as SQL Server, Azure SQL, Snowflake, etc.
  • Solid understanding of data warehousing, ETL pipelines, and data integration best practices.
  • Demonstrated ability to translate business needs into technical specifications, including performance, scalability, and security considerations.
  • Proven experience in leading analytics projects and managing delivery across multiple stakeholders and teams.
  • Excellent analytical, problem-solving, and decision-making abilities.
  • Strong communication skills, capable of engaging with both technical and non-technical audiences.
  • Ability to mentor and coach junior team members, fostering growth and adherence to best practices in BI and analytics.

Preferred Qualifications:

  • Power BI certification or equivalent credentials.
  • Familiarity with other BI tools such as Tableau, QlikView/Qlik Sense, etc.
  • Experience with AI-powered analytics tools, with a preference for ThoughtSpot.
  • Knowledge of cloud platforms like Microsoft Azure, AWS, or Google Cloud Platform (GCP).
  • Experience working in consulting environments, with exposure to various industries, domains, and business challenges.
  • Exposure to agile project delivery methodologies and client-facing delivery models.


08.12.2025
EY

EY EY - GDS Consulting AI DATA Data Scientist Senior E India, Tamil Nadu, Coimbatore

ืชื™ืื•ืจ:


Your key responsibilities

  • Have proven experience in driving Analytics GTM/Pre-Sales by collaborating with senior stakeholders in the client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data.
  • Need to work with clients in converting business problems/challenges to technical solutions, considering security, performance, scalability, etc. [3–5 years]
  • Need to understand current and future state enterprise architecture.
  • Need to contribute to various technical streams during implementation of the project.
  • Provide product and design level technical best practices.
  • Interact with senior client technology leaders, understand their business goals, and create, architect, propose, develop, and deliver technology solutions.
  • Define and develop client specific best practices around data management within a Hadoop or cloud environment.
  • Recommend design alternatives for data ingestion, processing, and provisioning layers.
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

AWS

  • Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
  • Experience in PySpark/Spark/Scala.
  • Experience using software version control tools (Git, Jenkins, Apache Subversion).
  • AWS certifications or other related professional technical certifications.
  • Experience with cloud or on-premises middleware and other enterprise integration technologies.
  • Experience writing MapReduce and/or Spark jobs.
  • Demonstrated strength in architecting data warehouse solutions and integrating technical components.
  • Good analytical skills with excellent knowledge of SQL.
  • 3+ years of work experience with very large data warehousing environments.
  • Excellent communication skills, both written and verbal.
  • 3+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
  • 3+ years of experience with data modelling concepts.
  • 3+ years of Python and/or Java development experience.
  • 3+ years' experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive).

Skills and attributes for success

  • Experience architecting highly scalable solutions on AWS.
  • Strong understanding of and familiarity with all AWS/GCP/Big Data ecosystem components.
  • Strong understanding of underlying AWS/GCP architectural concepts and distributed computing paradigms.
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
  • Hands-on experience with major components such as cloud ETLs, Spark, and Databricks.
  • Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
  • Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
  • Good knowledge of Apache Kafka and Apache Flume.
  • Experience in enterprise-grade solution implementations.
  • Experience in performance benchmarking of enterprise applications.
  • Experience in data security [in transit, at rest].
  • Strong UNIX operating system concepts and shell scripting knowledge.
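The streaming skills above (Spark Streaming, Kafka consumption) largely come down to windowed aggregation over an unbounded event stream. Neither Spark nor Kafka appears in this sketch; it emulates a tumbling-window count in plain Python, and the event timestamps and keys are invented for the example.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group (timestamp, key) events into fixed-size tumbling windows and
    count occurrences per (window, key) -- the core operation of a
    Spark-Streaming-style windowed aggregation."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event belongs to exactly one non-overlapping window.
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
counts = tumbling_window_counts(events, 10)
# Two windows: [0, 10) and [10, 20)
```

In a real Spark Structured Streaming job the same grouping would be expressed with `groupBy(window(...), key).count()`, with Kafka partitions feeding the input.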

To qualify for the role, you must have

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communicator (written and verbal, formal and informal).
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Strong verbal and written communication skills.
  • Must be a team player and enjoy working in a cooperative and collaborative team environment.
  • Adaptable to new technologies and standards.
  • Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
  • Responsible for the evaluation of technical risks and mapping out mitigation strategies.
  • Experience in data security [in transit, at rest].
  • Experience in performance benchmarking of enterprise applications.
  • Working knowledge of any of the cloud platforms: AWS, Azure, or GCP.
  • Excellent business communication, consulting, and quality process skills.
  • Excellent consulting skills.
  • Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.
  • Minimum 3–5 years of industry experience.

Ideally, youโ€™ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What we look for

  • People with technical experience and enthusiasm to learn new things in this fast-moving environment

You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that's right for you






08.12.2025
EY

EY EY - GDS Consulting AI DATA Snowflake Senior India, Tamil Nadu, Coimbatore

ืชื™ืื•ืจ:

Your key responsibilities

  • Lead and Architect migration of data analytics environment from Teradata to Snowflake with performance and reliability
  • Develop & deploy big data pipelines in a cloud environment using Snowflake cloud DW
  • ETL design, development, and migration of existing on-prem ETL routines to cloud services
  • Interact with senior leaders, understand their business goals, contribute to the delivery of the workstreams
  • Design and optimize model codes for faster execution

Skills and attributes for success

  • Hands-on developer in the field of data warehousing and ETL.
  • Hands-on development experience in Snowflake.
  • Experience in Snowflake modelling: roles, schemas, databases.
  • Experience integrating with third-party tools and ETL/DBT tools.
  • Experience with advanced Snowflake concepts, such as setting up resource monitors and performance tuning, would be preferable.
  • Applying object-oriented and functional programming styles to real-world Big Data Engineering problems using Java/Scala/Python
  • Develop data pipelines to perform batch and real-time/stream analytics on structured and unstructured data.
  • Experience with data processing patterns and distributed computing, and in building applications for real-time and batch analytics.
  • Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing.

To qualify for the role, you must have

  • Be a computer science graduate or equivalent with 3–7 years of industry experience
  • Have working experience in an Agile-based delivery methodology (preferable)
  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communicator (written and verbal, formal and informal).
  • Be a technical expert on all aspects of Snowflake
  • Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own
  • Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
  • Maintain deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
  • Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
  • Provide guidance on how to resolve customer-specific technical challenges

Ideally, youโ€™ll also have

  • Client management skills

What we look for

  • Minimum 5 years of experience as an architect on analytics solutions and around 2 years of experience with Snowflake.
  • People with technical experience and enthusiasm to learn new things in this fast-moving environment

You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that's right for you


ื‘ื•ืื• ืœืžืฆื•ื ืืช ืขื‘ื•ื“ืช ื”ื—ืœื•ืžื•ืช ืฉืœื›ื ื‘ื”ื™ื™ื˜ืง ืขื ืืงืกืคื•ื™ื ื˜. ื‘ืืžืฆืขื•ืช ื”ืคืœื˜ืคื•ืจืžื” ืฉืœื ื• ืชื•ื›ืœ ืœื—ืคืฉ ื‘ืงืœื•ืช ื”ื–ื“ืžื ื•ื™ื•ืช Ey-gds Consulting-ai Data-power Bi-manager ื‘ื—ื‘ืจืช Ey ื‘-India, Coimbatore. ื‘ื™ืŸ ืื ืืชื ืžื—ืคืฉื™ื ืืชื’ืจ ื—ื“ืฉ ื•ื‘ื™ืŸ ืื ืืชื ืจื•ืฆื™ื ืœืขื‘ื•ื“ ืขื ืืจื’ื•ืŸ ืกืคืฆื™ืคื™ ื‘ืชืคืงื™ื“ ืžืกื•ื™ื, Expoint ืžืงืœื” ืขืœ ืžืฆื™ืืช ื”ืชืืžืช ื”ืขื‘ื•ื“ื” ื”ืžื•ืฉืœืžืช ืขื‘ื•ืจื›ื. ื”ืชื—ื‘ืจื• ืœื—ื‘ืจื•ืช ืžื•ื‘ื™ืœื•ืช ื‘ืื–ื•ืจ ืฉืœื›ื ืขื•ื“ ื”ื™ื•ื ื•ืงื“ืžื• ืืช ืงืจื™ื™ืจืช ื”ื”ื™ื™ื˜ืง ืฉืœื›ื! ื”ื™ืจืฉืžื• ื”ื™ื•ื ื•ืขืฉื• ืืช ื”ืฆืขื“ ื”ื‘ื ื‘ืžืกืข ื”ืงืจื™ื™ืจื” ืฉืœื›ื ื‘ืขื–ืจืช ืืงืกืคื•ื™ื ื˜.