Expoint – all jobs in one place
Where the best experts and companies meet

Wanted: Strategic Communications Professional-ey Global at EY in India, Bengaluru

Find the perfect match for you with Expoint! Search for job opportunities as a Strategic Communications Professional-ey Global in India, Bengaluru and join a network of leading companies in the high-tech industry, such as EY. Sign up now and find your dream job with Expoint!
163 jobs found
08.10.2025
EY

EY EY - GDS Consulting AI DATA Gen Senior India, Karnataka, Bengaluru

Description:

Role Overview:

We are seeking a highly skilled and experienced Senior Data Scientist with 3-7 years of experience in Data Science and Machine Learning, preferably including NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. You will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate has a deep understanding of AI technologies and experience designing and implementing cutting-edge AI models and systems. Expertise in data engineering, DevOps, and MLOps practices will also be valuable in this role.

Your technical responsibilities:

  • Contribute to the design and implementation of state-of-the-art AI solutions.
  • Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI.
  • Collaborate with stakeholders to identify business opportunities and define AI project goals.
  • Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
  • Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
  • Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
  • Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
  • Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
  • Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs (a minimal sketch follows this list).
  • Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
  • Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
  • Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
  • Ensure compliance with data privacy, security, and ethical considerations in AI applications.
  • Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
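The similarity-search bullet above is, in practice, a nearest-neighbour lookup over embedding vectors. The following is a minimal, illustrative Python sketch (not EY's implementation): it assumes embeddings have already been produced by some model and uses in-memory NumPy arrays in place of a vector database such as Redis.

```python
# Minimal similarity-search sketch: given precomputed embedding vectors,
# return the top-k most similar documents by cosine similarity.
# Illustrative only; the random vectors stand in for real embeddings.
import numpy as np

def top_k_similar(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 3):
    """Return indices and scores of the k documents closest to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                       # cosine similarity per document
    idx = np.argsort(scores)[::-1][:k]   # highest similarity first
    return idx, scores[idx]

rng = np.random.default_rng(0)
docs = rng.normal(size=(100, 384))       # 100 documents, 384-dim embeddings
query = rng.normal(size=384)
indices, scores = top_k_similar(query, docs, k=3)
print(indices, scores)
```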

Requirements:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
  • Minimum 3-7 years of experience in Data Science and Machine Learning.
  • In-depth knowledge of machine learning, deep learning, and generative AI techniques.
  • Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch.
  • Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
  • Familiarity with computer vision techniques for image recognition, object detection, or image generation.
  • Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
  • Expertise in data engineering, including data curation, cleaning, and preprocessing.
  • Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
  • Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
  • Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
  • Understanding of data privacy, security, and ethical considerations in AI applications.
  • Track record of driving innovation and staying updated with the latest AI research and advancements.

Good to Have Skills:

  • Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
  • Utilize optimization tools and techniques, including MIP (Mixed Integer Programming); a toy example follows this list.
  • Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
  • Implement CI/CD pipelines for streamlined model deployment and scaling processes.
  • Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
  • Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
  • Implement monitoring and logging tools to ensure AI model performance and reliability.
  • Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.
  • Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models.
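As a hedged illustration of the Mixed Integer Programming item above, the toy model below selects projects under a budget cap using the open-source PuLP library (an assumption; the posting names no specific solver, and the numbers are invented).

```python
# Toy MIP: choose which projects to fund under a budget cap.
# Requires: pip install pulp (ships with the CBC solver).
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpStatus, value

costs  = {"A": 40, "B": 60, "C": 30}   # invented example data
values = {"A": 70, "B": 90, "C": 40}
budget = 90

prob = LpProblem("project_selection", LpMaximize)
pick = {p: LpVariable(f"pick_{p}", cat="Binary") for p in costs}

prob += lpSum(values[p] * pick[p] for p in costs)            # objective
prob += lpSum(costs[p] * pick[p] for p in costs) <= budget   # budget cap

prob.solve()
print(LpStatus[prob.status], {p: int(value(pick[p])) for p in costs})
```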



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

08.10.2025
EY

EY EY-GDS Consulting-AI DATA-GCP Architect-Manager India, Karnataka, Bengaluru

Description:

Critical roles and responsibilities for the GCP Architect - Manager

We are looking for a highly skilled GCP Architect to lead a migration project, ensuring a seamless transition of workloads, applications, and services to Google Cloud Platform (GCP) while integrating with other cloud environments such as AWS and Azure.

Key Responsibilities:

  • Cloud Strategy & Architecture: Design and implement scalable, secure, and cost-effective GCP architectures for a multi-cloud environment.
  • Migration Planning: Develop and execute migration strategies for applications, data, and infrastructure from on-premise or other cloud platforms (AWS/Azure) to GCP.
  • Infrastructure as Code (IaC): Utilize Terraform, Ansible, or other IaC tools for automated provisioning and management of cloud resources.
  • Security & Compliance: Ensure cloud environments adhere to industry security best practices, compliance standards (e.g., ISO, SOC, HIPAA), and Google Cloud security frameworks.
  • CI/CD & DevOps Integration: Work with DevOps teams to integrate CI/CD pipelines using Azure DevOps, GitHub Actions, or Jenkins for cloud deployments.
  • Networking & Hybrid Cloud: Design and implement hybrid and multi-cloud networking solutions, including VPNs, interconnects, and service mesh (Anthos, Istio).
  • Performance & Cost Optimization: Monitor, optimize, and provide recommendations for cloud resource utilization, cost efficiency, and performance enhancements.
  • Stakeholder Collaboration: Work closely with business, security, and engineering teams to align cloud solutions with organizational goals.
  • Incident Management & Troubleshooting: Provide technical leadership for incident resolution, root cause analysis, and continuous improvement in cloud operations.

Experience

  • 7-11 years of experience.

Technical Expertise:

  • Strong hands-on experience with GCP services (Compute Engine, GKE, Cloud Functions, BigQuery, IAM, Cloud Armor, etc.).
  • Familiarity with AWS and/or Azure services and cross-cloud integrations.
  • Proficiency in Terraform, Ansible, or other IaC tools.
  • Experience with containerization (Docker, Kubernetes) and microservices architecture.
  • Strong networking skills, including VPC design, Cloud Interconnect, and hybrid cloud solutions.
  • Understanding of security best practices, encryption, and identity management in a multi-cloud setup.
  • Experience in migration from on-premise environments to GCP or hybrid cloud architectures.
  • Experience with Anthos, Istio, or service mesh technologies.
  • Strong scripting skills in Python, Bash, or Go for automation (a minimal Python sketch follows this list).
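To illustrate the "Python for automation" expectation above, here is a minimal, hypothetical sketch using the google-cloud-storage client library: it lists Cloud Storage buckets in a project and flags any missing an "owner" label, the kind of small governance check an architect might script. The label policy and project ID are invented for the example.

```python
# Hypothetical governance check: flag buckets that lack an "owner" label.
# Requires: pip install google-cloud-storage, plus application credentials.
from google.cloud import storage

def audit_bucket_labels(project_id: str) -> None:
    client = storage.Client(project=project_id)
    for bucket in client.list_buckets():
        labels = bucket.labels or {}
        status = "ok" if "owner" in labels else "REVIEW: missing owner label"
        print(f"{bucket.name}: {status}")

# audit_bucket_labels("my-sample-project")  # placeholder project ID
```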

Certifications (Preferred):

  • Google Professional Cloud Architect
  • Google Professional Cloud Security Engineer
  • AWS/Azure Architect certifications (nice to have)

Soft Skills:

  • Excellent problem-solving and analytical skills.
  • Strong communication and stakeholder management abilities.
  • Ability to lead and mentor technical teams in a fast-paced environment.



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

08.10.2025
EY

EY FS - RC MS EY Comply RVS Unit Pricing India, Karnataka, Bengaluru

Description:

Position: Staff

Australian Shift (Start time: 3:30 AM IST)

Responsibilities:

  • Asset Transitions & Investment Book of Records
    • Support asset transitions by loading trade files, monitoring settlements, and ensuring accurate handover to investment managers.
    • Run daily batch jobs to generate lifecycle events (e.g., coupons, closeouts) and sign off on the Investment Book of Records (IBOR) before unit pricing.
  • Valuation, Unit Pricing & Fees
    • Run and review all pricing jobs, resolve exceptions, and deliver accurate daily unit pricing for a broad range of Australian funds.
    • Oversee calculation and provisioning of income, tax, performance, management, and custody fees.
    • Perform high-level checks on unit price movements and initiate release of unit prices to registries and reporting systems (see the sketch after this list).
  • Document new procedures and controls to enhance operational risk management.
  • Work with business, architects, and technology partners to translate requirements into scalable solutions and contribute to product roadmaps.
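For context on the pricing checks described above, the sketch below shows the basic arithmetic of a unit price (net asset value divided by units on issue) and a simple day-on-day movement check. The figures and the 1% tolerance are invented for illustration and do not reflect any fund's actual policy.

```python
# Illustrative unit-pricing arithmetic; all numbers are made up.
def unit_price(nav: float, units_on_issue: float) -> float:
    """Unit price = net asset value / units on issue (assumed formula)."""
    return nav / units_on_issue

def movement_check(price_today: float, price_prior: float,
                   tolerance: float = 0.01) -> bool:
    """Return True if the day-on-day movement is within tolerance."""
    move = abs(price_today - price_prior) / price_prior
    return move <= tolerance

today = unit_price(nav=102_450_000.0, units_on_issue=100_000_000.0)  # 1.0245
prior = 1.0198
print(f"movement within tolerance: {movement_check(today, prior)}")
```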

Requirements:

  • 1-3 years of experience in the asset management/WAM sector; exposure to Australian asset management is an added advantage.
  • A Master's degree in accounting or finance is mandatory; a CPA or MBA is good to have.
  • A general understanding of Australian Accounting Standards Board (AASB) standards or International Financial Reporting Standards (IFRS) and regulatory requirements is a plus.
  • A strong understanding of the financial industry, including fund accounting, reconciliation, expense reporting, tax reporting, asset types, and derivatives, is mandatory.
  • Hands-on experience with SimCorp Dimension is required.
  • Strong understanding of security valuation, market data management, corporate actions, reconciliation processes, and vendors such as Bloomberg.
  • A strong understanding of fixed income securities and derivatives products is a must.
  • Crisp and effective executive communication skills, including significant experience presenting cross-functionally and across all levels.
  • Understanding of business requirements documents (BRDs) would be an added advantage.
  • Strong attention to detail, proficiency in MS Office applications (Word, Excel, PowerPoint), excellent analytical skills, and the ability to interact effectively with various stakeholders.



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

08.10.2025
EY

EY EY - GDS Consulting AI DATA Data Migration Architect Manager India, Karnataka, Bengaluru

Description:

Your key responsibilities

  • Lead and manage large data migration projects, ensuring successful execution and delivery.
  • Design and implement data architecture solutions that align with business requirements and best practices.
  • Collaborate with cross-functional teams to gather requirements and define data migration strategies.
  • Utilize ETL tools, with a preference for Informatica/IICS, to facilitate data extraction, transformation, and loading processes.
  • Ensure data quality and integrity throughout the migration process (a minimal reconciliation sketch follows this list).
  • Communicate effectively with stakeholders, providing updates on project progress and addressing any concerns.
  • Work in an onshore-offshore model, coordinating with teams across different locations and managing multi-vendor scenarios.
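As a small illustration of the data-quality point above, the sketch below reconciles row counts and a per-record checksum between a source extract and the loaded target. It uses toy in-memory data; in a real migration the inputs would come from the source and target systems (for example via Informatica/IICS extracts).

```python
# Toy post-load reconciliation: compare row counts and a checksum built
# from selected fields of each record, independent of row order.
import hashlib

def checksum(rows, key_fields):
    h = hashlib.sha256()
    for row in sorted(rows, key=lambda r: tuple(r[k] for k in key_fields)):
        h.update("|".join(str(row[k]) for k in key_fields).encode())
    return h.hexdigest()

source = [{"id": 1, "amt": "10.00"}, {"id": 2, "amt": "20.50"}]
target = [{"id": 2, "amt": "20.50"}, {"id": 1, "amt": "10.00"}]

assert len(source) == len(target), "row count mismatch"
assert checksum(source, ["id", "amt"]) == checksum(target, ["id", "amt"]), \
    "checksum mismatch"
print("reconciliation passed")
```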

Skills and attributes for success

  • 15+ years of experience in data migration, data architecture, or data engineering.
  • Proven track record of managing large data migration projects successfully.
  • Strong technical management skills with a focus on cloud ecosystems.
  • Experience with ETL tools; Informatica/IICS experience is preferred, but any ETL tool experience is acceptable.
  • Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
  • Desirable experience working in an onshore-offshore model and in multi-vendor environments.
  • Be part of a dynamic team driving significant data transformation initiatives.
  • Collaborative work culture that values innovation and professional growth.

Ideally, you’ll also have

  • Client management skills

What we look for

  • A minimum of 8 years of experience as an architect on analytics solutions and around 8 years of experience with Snowflake.
  • People with technical experience and enthusiasm to learn new things in this fast-moving environment.

You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you
07.10.2025
EY

EY EY - GDS Consulting AI DATA -Gen Senior India, Karnataka, Bengaluru

Description:

Job Description: EY GDS – Data and Analytics (D&A) – Senior – Senior Data Scientist

We are seeking a highly skilled and experienced Senior Data Scientist with 3-7 years of experience in Data Science and Machine Learning, preferably including NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. You will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate has a deep understanding of AI technologies and experience designing and implementing cutting-edge AI models and systems. Expertise in data engineering, DevOps, and MLOps practices will also be valuable in this role.

Your technical responsibilities:

  • Contribute to the design and implementation of state-of-the-art AI solutions.
  • Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI.
  • Collaborate with stakeholders to identify business opportunities and define AI project goals.
  • Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
  • Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
  • Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
  • Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
  • Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
  • Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
  • Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
  • Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
  • Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases (a toy example follows this list).
  • Ensure compliance with data privacy, security, and ethical considerations in AI applications.
  • Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
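As a toy illustration of the evaluation-metrics bullet above, the snippet below computes a simple token-level F1 score between a generated answer and a reference answer. This is a stand-in metric chosen for illustration; a real evaluation framework would combine several automatic metrics with human review.

```python
# Token-level F1 between a generated string and a reference string.
from collections import Counter

def token_f1(generated: str, reference: str) -> float:
    gen, ref = generated.lower().split(), reference.lower().split()
    overlap = sum((Counter(gen) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(gen)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

print(token_f1("the invoice was paid in May",
               "the invoice was settled in May"))
```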

Requirements:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
  • Minimum 3-7 years of experience in Data Science and Machine Learning.
  • In-depth knowledge of machine learning, deep learning, and generative AI techniques.
  • Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch.
  • Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
  • Familiarity with computer vision techniques for image recognition, object detection, or image generation.
  • Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
  • Expertise in data engineering, including data curation, cleaning, and preprocessing.
  • Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
  • Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
  • Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
  • Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
  • Understanding of data privacy, security, and ethical considerations in AI applications.
  • Track record of driving innovation and staying updated with the latest AI research and advancements.

Good to Have Skills:

  • Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
  • Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
  • Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
  • Implement CI/CD pipelines for streamlined model deployment and scaling processes.
  • Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
  • Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
  • Implement monitoring and logging tools to ensure AI model performance and reliability.
  • Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.
  • Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models.



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

07.10.2025
EY

EY FS-RC-MS - EY Comply RVS- Aus Regulatory Reporting India, Karnataka, Bengaluru

Description:

Responsibilities:

Prepare and validate the following regulatory and tax reports for Australian funds:

  • ABS Reporting (Australian Bureau of Statistics)
  • Business Activity Statements (BAS) - Monthly and quarterly
  • Pay as you Go (PAYG) - Monthly and quarterly
  • RG97 Performance Fee Reporting
  • ATO Reporting
  • DTA Monitoring
  • Tax Realignment
  • CVA/DVA assessment for financial reporting
  • Calculation and processing of distributions for all Managed Investment Schemes (MIS)
  • Identify and implement process improvements and automation opportunities, working closely with business and technology teams.
  • Document new procedures and controls to enhance operational risk management.
  • Work with business, architects, and technology partners to translate requirements into scalable solutions and contribute to product roadmaps.
  • Act as a subject matter expert for operational processes, controls, and technology enablement.

Requirements:

  • 4-6 years of experience in the asset management/WAM sector; exposure to Australian asset management is an added advantage.
  • A Master's degree in accounting or finance is mandatory; a CPA, CA, or MBA is good to have.
  • Experience in financial statement reporting or regulatory reporting to various IFRS-based regulators.
  • A general understanding of Australian Accounting Standards Board (AASB) standards or International Financial Reporting Standards (IFRS) and regulatory requirements is a plus.
  • A strong understanding of the financial industry, including fund accounting, reconciliation, expense reporting, tax reporting, asset types, and derivatives, is mandatory.
  • Hands-on experience with SimCorp Dimension is required.
  • Strong understanding of security valuation, market data management, corporate actions, reconciliation processes, and vendors such as Bloomberg.
  • A strong understanding of fixed income securities and derivatives products is a must.
  • Crisp and effective executive communication skills, including significant experience presenting cross-functionally and across all levels.
  • Understanding of business requirements documents (BRDs) would be an added advantage.
  • Strong attention to detail, proficiency in MS Office applications (Word, Excel, PowerPoint), excellent analytical skills, and the ability to interact effectively with various stakeholders.



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

07.10.2025
EY

EY EY - GDS Consulting AI DATA Informatica CDGC Manager India, Karnataka, Bengaluru

Description:

Objectives and Purpose

The Data Quality Manager is responsible for driving enterprise-wide data quality and governance initiatives using Informatica Intelligent Data Management Cloud (IDMC), including Cloud Data Quality (CDQ), Cloud Data Governance & Catalog (CDGC), and related components.

  • Data Quality Strategy & Execution
    • Design and implement cloud-based data profiling frameworks to assess data health across critical datasets (distinct counts, nulls, outliers, patterns, etc.).
    • Develop and maintain data quality rules and specifications for validity, completeness, conformity, and integrity; enable parameterization and reuse across domains.
    • Implement exception management workflows, automate notifications, and coordinate remediation with data stewards.
    • Create and manage cleansing and standardization assets, including parsing, casing, dictionary lookups, address validation, and matching.
    • Build and publish scorecards to monitor KPIs, thresholds, and trends; socialize findings with business and product stakeholders.
    • Define Critical Data Elements (CDEs) in collaboration with data stewards and align thresholds with governance standards.
    • Manage reference data and code lists using Reference 360.
  • Data Governance & Metadata Management
    • Configure and operate CDGC and Metadata Command Centre (MCC) for metadata scans, lineage mapping, glossary curation, and asset classification.
    • Link DQ scorecards to governed assets to ensure traceability and transparency across the data lifecycle.
    • Expose end-to-end lineage from source to consumption layers, supporting governance and compliance initiatives.
  • Data Engineering & Automation
    • Lead the design, optimization, and maintenance of data pipelines and integration frameworks aligned with enterprise ETL and data governance principles.
    • Embed DQ validation checkpoints within CDI mappings and taskflows to ensure continuous data quality enforcement.
    • Leverage IICS REST APIs and Python for orchestration, automation, and post-processing of exception extracts (a hedged sketch of this pattern follows this list).
    • Implement Cloud API integration patterns (OAuth, throttling, managed API consumption) to trigger and monitor DQ flows programmatically.
    • Support deployment automation, migration, and operational enablement across environments.
  • Technical Leadership & Collaboration
    • Collaborate with enterprise architects, data scientists, and visualization teams to enable advanced analytics, machine learning, and predictive modelling.
    • Mentor and guide technical teams in DQ best practices, performance optimization, and cloud enablement.
    • Promote reusability, standardization, and a culture of continuous improvement across data engineering and governance functions.
    • Partner with data governance councils to align DQ frameworks with enterprise data policies.
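To illustrate the API-driven orchestration mentioned above (IICS REST APIs and Python), the sketch below shows the general pattern of obtaining an OAuth token, triggering a taskflow over REST, and polling its status. The host, endpoints, and field names are hypothetical placeholders, not the actual IICS/IDMC API.

```python
# Hypothetical token -> trigger -> poll pattern using the requests library.
import time
import requests

BASE = "https://api.example.com"          # placeholder host, not a real API

def get_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(f"{BASE}/oauth/token",
                         data={"grant_type": "client_credentials",
                               "client_id": client_id,
                               "client_secret": client_secret},
                         timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def run_taskflow(token: str, taskflow_id: str) -> str:
    headers = {"Authorization": f"Bearer {token}"}
    resp = requests.post(f"{BASE}/taskflows/{taskflow_id}/start",
                         headers=headers, timeout=30)
    resp.raise_for_status()
    run_id = resp.json()["run_id"]
    while True:                            # poll until the run finishes
        status = requests.get(f"{BASE}/runs/{run_id}",
                              headers=headers, timeout=30).json()["status"]
        if status in ("SUCCESS", "FAILED"):
            return status
        time.sleep(10)

# status = run_taskflow(get_token("id", "secret"), "dq_taskflow_01")
```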

Bachelor’s degree in computer science, Engineering, or Data Science.

Experience in Data Quality, Data Governance, or Data Engineering.

  • Proven expertise in:
    • Informatica IICS / IDMC – Cloud Data Quality, Cloud Data Integration, CDGC, MCC, and Data Marketplace.
    • Cloud Data Profiling, Rule Specifications, Exception Tasks, Scorecards, and Cleanse Assets.
    • CDGC configuration – metadata cataloguing, lineage, glossary, and DQ linkage.
    • Reference 360 – code lists, crosswalks, and lifecycle management.
    • CDI mappings & taskflows, parameterization, and dependency orchestration.
    • Cloud APIs and automation scripting (Python, REST APIs).
    • Cloud platforms (AWS / Azure), Databricks, Spark, and modern data architecture (Mesh, Fabric).
    • Data modelling, relational databases, and CI/CD using GitHub / GitLab.

Master’s degree in data or computer science.

  • Certifications: Databricks Certified Data Engineer Professional, AWS Certified Data Engineer – Associate.
  • Experience with IDMC MDM SaaS, Data Marketplace, and Unity Catalog for governance and access control.
  • Familiarity with regulatory and compliance frameworks (GDPR, HIPAA, etc.).
  • Exposure to the pharma or life sciences domain.
  • Knowledge of Snowflake, Redshift, Postgres, and NoSQL data platforms.
  • Experience with orchestration tools.
  • Proficiency in SQL and data analysis.
  • Strong problem-solving, analytical, and decision-making skills.
  • Excellent communication and stakeholder management across business and technology teams.
  • Proven leadership in managing distributed teams and driving data quality initiatives.
  • Ability to operate in fast-paced environments and manage multiple priorities effectively.
  • Commitment to continuous learning, innovation, and cloud modernization.

Come find your dream job in high-tech with Expoint. With our platform you can easily search for Strategic Communications Professional-ey Global opportunities at EY in India, Bengaluru. Whether you are looking for a new challenge or want to work with a specific organization in a particular role, Expoint makes it easy to find the perfect job match for you. Connect with leading companies in your area today and advance your high-tech career! Sign up today and take the next step in your career journey with Expoint.