Your key responsibilities
• Develop and deploy Azure Databricks in a cloud environment using Azure Cloud services.
• Configure and replicate the Azure Databricks platform, including workspaces, clusters, and storage, based on existing Hadoop workloads.
• Migrate Hadoop jobs (YARN, Spark), ETL workflows (Sqoop scripts, Hive), Kafka streams, Impala queries, and database objects to Azure Databricks, ensuring efficient data cleaning, transformation, and loading.
• Set up and manage workflow orchestration using Databricks Jobs Scheduler, handling dependencies and scheduling of data processing tasks.
• Implement monitoring solutions to track the health and performance of the Databricks environment, configuring alerting mechanisms for proactive issue resolution.
• Ensure all code is maintained in the existing Git repository, undergoing peer reviews to maintain quality and readability.
• Conduct System Integration Testing (SIT) and QA testing parallel to development, ensuring all critical workflows and reports function correctly.
• Document and address any issues identified during testing, compiling detailed reports for stakeholders.
• Implement the cutover strategy and production migration according to the deployment plan, communicating with stakeholders and ensuring the migration runs during non-restricted hours.
• Implement data validation checks to confirm the integrity and completeness of the migrated data.
• Document the deployment process, including encountered issues and their resolutions, and provide stakeholders with information on how to report any issues.
• Provide hyper care support post-deployment, addressing any issues promptly and transitioning to standard operational support as the environment stabilizes.
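The data-validation responsibility above can be sketched in plain Python. This is a minimal illustration, not part of the posting: the table names, dictionary shapes, and the `validate_migration` helper are hypothetical stand-ins for what would, in practice, be Spark SQL queries run against the Hadoop source and the Databricks target.

```python
# Minimal sketch of post-migration data validation: compare row counts and a
# per-table checksum between the source (Hadoop) and target (Databricks)
# copies. The dictionaries stand in for real query results.

def validate_migration(source_stats, target_stats):
    """Return a list of (table, problem) tuples; an empty list means all checks pass."""
    issues = []
    for table, src in source_stats.items():
        tgt = target_stats.get(table)
        if tgt is None:
            issues.append((table, "missing in target"))
            continue
        if src["row_count"] != tgt["row_count"]:
            issues.append((table, f"row count {src['row_count']} != {tgt['row_count']}"))
        elif src["checksum"] != tgt["checksum"]:
            issues.append((table, "checksum mismatch"))
    return issues

# Hypothetical per-table statistics gathered from each platform.
source = {"orders": {"row_count": 1000, "checksum": "a1b2"},
          "customers": {"row_count": 250, "checksum": "c3d4"}}
target = {"orders": {"row_count": 1000, "checksum": "a1b2"},
          "customers": {"row_count": 249, "checksum": "c3d4"}}

print(validate_migration(source, target))
```

A real cutover would run checks like this per partition and feed the result into the stakeholder reports the posting mentions.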
Skills and attributes for success
• 3 to 7 years of experience in developing data ingestion, data processing, and analytical pipelines for big data, relational database, NoSQL, and data warehouse solutions
• Extensive hands-on experience implementing data migration and data processing using Azure services: Databricks (notebooks, workspaces, Delta Lake, clusters), ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Azure Event Hubs, etc.
• Hands-on programming experience in Python/PySpark
• Good knowledge of data warehouse (DWH) concepts and implementation experience with ACID transactions
• Well versed in DevOps and CI/CD deployments
• Must have hands-on experience with SQL and procedural SQL languages
• Strong analytical skills and enjoys solving complex technical problems
To qualify for the role, you must have
• Be a computer science graduate or equivalent with 3 to 7 years of industry experience
• Have working experience in an Agile-based delivery methodology (preferable)
• Flexible, proactive, and self-motivated working style with strong personal ownership of problem resolution
• Proficiency in Software Development Best Practices
• Excellent debugging and optimization skills
• Experience in enterprise-grade solution implementations and in converting business problems/challenges into technical solutions, considering security, performance, scalability, etc.
• Excellent communicator (written and verbal, formal and informal)
• Participation in all aspects of the solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
• Client management skills
Ideally, you’ll also have
• Experience on HealthCare domains
Skills and attributes for success
• Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates
• Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations.
• Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due-diligence standpoint.
What we look for
• A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment
• An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide.
• Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries
You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
• Support, coaching and feedback from some of the most engaging colleagues around
• Opportunities to develop new skills and progress your career
• The freedom and flexibility to handle your role in a way that’s right for you
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with 3 to 7 years of experience in Data Science and Machine Learning, preferably including NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. You will be central to the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate has a deep understanding of AI technologies and experience designing and implementing cutting-edge AI models and systems. Expertise in data engineering, DevOps, and MLOps practices will also be valuable in this role.
Your technical responsibilities:
• Contribute to the design and implementation of state-of-the-art AI solutions.
• Assist in the development and implementation of AI models and systems, leveraging techniques such as large language models (LLMs) and generative AI.
• Collaborate with stakeholders to identify business opportunities and define AI project goals.
• Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
• Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
• Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
• Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
• Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
• Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
• Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
• Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
• Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
• Ensure compliance with data privacy, security, and ethical considerations in AI applications.
• Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
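The similarity-search responsibility above boils down to nearest-neighbour retrieval over embedding vectors. A minimal pure-Python sketch follows; the three-dimensional "embeddings" and document names are toy assumptions, standing in for model-generated vectors of hundreds of dimensions stored in a vector database such as Redis.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def top_k(query, corpus, k=2):
    """Return the ids of the k corpus vectors most similar to the query."""
    scored = sorted(corpus.items(),
                    key=lambda kv: cosine_similarity(query, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy stand-ins for real embeddings.
corpus = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], corpus))  # doc_a and doc_b are closest
```

Production systems replace the exhaustive sort with an approximate nearest-neighbour index, but the scoring idea is the same.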
Requirements:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
• 3 to 7 years of experience in Data Science and Machine Learning.
• In-depth knowledge of machine learning, deep learning, and generative AI techniques.
• Proficiency in programming languages such as Python and R, and in frameworks like TensorFlow or PyTorch.
• Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
• Familiarity with computer vision techniques for image recognition, object detection, or image generation.
• Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
• Expertise in data engineering, including data curation, cleaning, and preprocessing.
• Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
• Experience collaborating with software engineering and operations teams to ensure seamless integration and deployment of AI models.
• Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
• Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
• Understanding of data privacy, security, and ethical considerations in AI applications.
• Track record of driving innovation and staying updated with the latest AI research and advancements.
Good to Have Skills:
• Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
• Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
• Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
• Implement CI/CD pipelines for streamlined model deployment and scaling processes.
• Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
• Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
• Implement monitoring and logging tools to ensure AI model performance and reliability.
• Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.
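The monitoring bullet above can be illustrated with a minimal drift check that compares live feature statistics against a training-time baseline. The function name, threshold, and numbers are illustrative assumptions; a production MLOps stack would emit such signals to a monitoring tool rather than return a boolean.

```python
import statistics

def drift_alert(baseline_mean, live_values, threshold=0.2):
    """Flag drift when the live mean deviates from the baseline mean by more
    than `threshold` as a fraction of the baseline. A real pipeline would
    compute this per feature and route alerts to the monitoring system."""
    live_mean = statistics.mean(live_values)
    return abs(live_mean - baseline_mean) / abs(baseline_mean) > threshold

print(drift_alert(10.0, [10.1, 9.8, 10.3]))   # stable: within threshold
print(drift_alert(10.0, [14.0, 13.5, 14.2]))  # drifted: mean moved ~39%
```

Checks like this are typically scheduled alongside the CI/CD pipelines the list mentions, so a drifting model is caught before its predictions degrade.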
Must have experience
• Design, build, and maintain scalable pipelines using PySpark on AWS.
• Knowledge of Databricks on AWS is a must-have
• Work with AWS services such as S3, Glue, EMR, and Redshift for data storage, transformation, and querying.
• Hands-on experience with Redshift, including performance tuning and data modeling. Strong SQL skills with experience in querying and optimizing large datasets.
• Manage and monitor data workflows using orchestration tools like Apache Airflow
• Knowledge of CI/CD workflows for data engineering projects.
• Utilize Git for version control, ensuring proper collaboration and tracking of code changes.
• Establish and follow best practices for repository management, branching, and code reviews.
• Good to have: DBT exposure, contributing to DBT transformations and assisting in setting up data modeling workflows
• Working experience with Agile and Scrum methodologies
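The orchestration bullet above (Airflow-style workflow management) is, at its core, dependency-ordered task execution. A hedged sketch of that idea in plain Python using the standard library's `graphlib`; the task names describe a hypothetical S3-to-Redshift pipeline, and a real deployment would declare these as Airflow operators rather than dictionary entries.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract from S3, transform with Glue/PySpark,
# load into Redshift, then run data-quality checks. Each key's value is
# the set of tasks it depends on, mirroring how an Airflow DAG wires
# upstream/downstream dependencies.
dag = {
    "extract_s3": set(),
    "transform_glue": {"extract_s3"},
    "load_redshift": {"transform_glue"},
    "quality_checks": {"load_redshift"},
}

# Resolve a valid execution order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow adds scheduling, retries, and monitoring on top, but the dependency resolution it performs is exactly this topological ordering.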