Expoint – all jobs in one place
Finding the best job has never been easier

FS RC-EY Comply RVS - Axiom jobs at EY in India, Pune

Discover your perfect match with Expoint. Search for job opportunities as an FS RC-EY Comply RVS - Axiom in Pune, India, and join the network of leading companies in the high-tech industry, like EY. Sign up now and find your dream job with Expoint.
60 jobs found
Yesterday
EY

EY EY - GDS Consulting AI DATA AWS Plus DBX Data Engineer Senio... India, Maharashtra, Pune

Limitless High-tech career opportunities - Expoint
Description:

Your key responsibilities

  • Have proven experience in driving Analytics GTM/Pre-Sales by collaborating with senior stakeholder/s in the client and partner organization in BCM, WAM, Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops as well as managing in flight projects focused on cloud and big data.
  • Need to work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. (3–5 years)
  • Need to understand current and future state enterprise architecture.
  • Need to contribute to various technical streams during project implementation.
  • Provide product and design level technical best practices
  • Interact with senior client technology leaders, understand their business goals, create, architect, propose, develop and deliver technology solutions
  • Define and develop client specific best practices around data management within a Hadoop environment or cloud environment
  • Recommend design alternatives for data ingestion, processing and provisioning layers
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark
  • Develop data ingestion programs to ingest real-time data from LIVE sources using Apache Kafka, Spark Streaming and related technologies
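A minimal sketch of the tumbling-window aggregation such a streaming ingestion job performs, written in plain Python rather than the Spark Streaming API (the event shape and window size here are illustrative, not from any real pipeline):

```python
from collections import defaultdict

def windowed_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed tumbling windows and
    count occurrences per key -- the same aggregation a Spark Streaming
    job would run over Kafka micro-batches."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Simulated click events: (epoch seconds, user id)
events = [(0, "u1"), (30, "u1"), (59, "u2"), (60, "u1"), (125, "u2")]
print(windowed_counts(events))
# {(0, 'u1'): 2, (0, 'u2'): 1, (60, 'u1'): 1, (120, 'u2'): 1}
```

In a real Spark Streaming job the same logic is expressed with `groupBy(window(...), key).count()` over a Kafka source.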

AWS

  • Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, Quick Sight, etc.
  • Experience in PySpark, Spark, or Scala
  • Experience using version control and CI tools (Git, Jenkins, Apache Subversion)
  • AWS certifications or other related professional technical certifications
  • Experience with cloud or on-premises middleware and other enterprise integration technologies.
  • Experience in writing MapReduce and/or Spark jobs.
  • Demonstrated strength in architecting data warehouse solutions and integrating technical components.
  • Good analytical skills with excellent knowledge of SQL.
  • 3+ years of work experience with very large data warehousing environments
  • Excellent communication skills, both written and verbal
  • 3+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ ELT, and reporting/analytic tools.
  • 3+ years of experience with data modelling concepts
  • 3+ years of Python and/or Java development experience
  • 3+ years’ experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive)
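As a rough illustration of the warehouse-style SQL this role calls for, here is a typical GROUP BY rollup run against an in-memory SQLite stand-in (the `sales` table and its columns are invented for the example, not from any real schema; the same statement would run on Redshift):

```python
import sqlite3

# In-memory stand-in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("APAC", 100.0), ("APAC", 50.0), ("EMEA", 75.0)])

# A typical reporting rollup: total sales per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 75.0)]
```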

Skills and attributes for success

  • Experience architecting highly scalable solutions on AWS.
  • Strong understanding of and familiarity with AWS/GCP/Big Data ecosystem components
  • Strong understanding of underlying AWS/GCP Architectural concepts and distributed computing paradigms
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
  • Hands-on experience with major components such as cloud ETL services, Spark, and Databricks
  • Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB
  • Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop Architectural concepts and distributed computing paradigms
  • Good knowledge of Apache Kafka and Apache Flume
  • Experience in enterprise-grade solution implementations
  • Experience in performance benchmarking of enterprise applications
  • Experience in data security (in transit and at rest)
  • Strong UNIX operating system concepts and shell scripting knowledge
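The multi-partition Kafka consumption pattern mentioned above can be sketched in plain Python; the partitions and committed offsets here are in-memory stand-ins (a real consumer would use a Kafka client library such as `kafka-python` or `confluent-kafka`):

```python
def consume_partitions(partitions, offsets):
    """Drain messages from several simulated Kafka partitions, resuming
    each from its committed offset. Partition data and offsets are
    in-memory stand-ins for a real broker and offset store."""
    consumed = []
    for pid, messages in partitions.items():
        start = offsets.get(pid, 0)       # resume from last commit
        consumed.extend(messages[start:])
        offsets[pid] = len(messages)      # "commit" the new offset
    return consumed, offsets

partitions = {0: ["a", "b", "c"], 1: ["x", "y"]}
offsets = {0: 1}  # partition 0 already read one message
msgs, offsets = consume_partitions(partitions, offsets)
print(msgs)     # ['b', 'c', 'x', 'y']
print(offsets)  # {0: 3, 1: 2}
```

The same resume-from-offset idea is what lets multiple Spark jobs consume from multiple partitions without reprocessing or losing messages.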

To qualify for the role, you must have

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communicator (written and verbal formal and informal).
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Strong verbal and written communication skills.
  • Must be a team player and enjoy working in a cooperative and collaborative team environment.
  • Adaptable to new technologies and standards.
  • Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support
  • Responsible for the evaluation of technical risks and map out mitigation strategies
  • Working knowledge of at least one cloud platform: AWS, Azure, or GCP
  • Excellent business communication, consulting, and quality-process skills
  • Excellence in leading solution architecture, design, build, and execution for leading clients in the banking, wealth and asset management, or insurance domains
  • Minimum 3–5 years of industry experience

Ideally, you’ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What we look for

  • People with technical experience and enthusiasm to learn new things in this fast-moving environment

You will work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

08.12.2025
EY

EY FS RC-EY COMPLY RVS - AXIOM India, Maharashtra, Pune

Description:

Art Director, Supervising Associate

As an accomplished event art director, you are an experienced creative champion who leads and influences innovative thinking and strategic visions that help our clients solve their most complex business challenges with unique solutions. You’ll look at every creative task as an experience, always pushing for a unique, memorable and innovative outcome. You’ll advance the work by drawing on vast experience and knowledge, settling for nothing less than the best.

You’ll be part of an expansive and talented team working independently and collaboratively. You’ll be empowered to learn and grow together with other creative minds. Your contributions and ideas will be valued and heard, and you’ll have opportunities to innovate and take part in efforts that advance our creative team.

Your key responsibilities

You’ll lead ideation sessions, mentor teams, and be actively developing award-winning creative. You’ll lead, mentor, rally, create, and inspire. Through clear communication you’ll be able to articulate and present compelling creative concepts that will be essential in gaining stakeholder alignment and advancing project goals.

You’ll collaborate closely with graphic designers, motion graphic designers, videographers, and content strategists to create cohesive, elevated visual content across multi-faceted and complex event channels. Success in this role requires a strong creative vision, a deep expertise in experiential design from strategy through execution, and the capacity to manage multiple event deliverables simultaneously.

Skills and attributes for success

  • Ability to proactively foster exceptional client relationships to build trust that leads to the co-development of new opportunities
  • Sound business acumen to fully comprehend stakeholders’ strategic vision and influence the development of the creative throughout the lifecycle of an event
  • Ability to initiate and successfully lead creative ideation sessions inspiring new design perspectives
  • Proven ability to inspire, guide, and cultivate innovation and creativity by harnessing the strengths of diverse skillsets within a multi-disciplinary creative team.
  • Strong leadership skills with the ability to lead by example

To qualify for the role, you must have

  • Bachelor's degree in graphic design or related discipline or equivalent work experience as an Art Director
  • Eight-plus years of event-related art direction and design experience
  • Comprehensive knowledge in event and creative industry practices, digital trends, innovation and technology demonstrated within a portfolio.
  • Advanced creative conceptual thinking and design skills and the ability to constructively critique colleagues’ concepts
  • Ability to drive development of creative briefs, storyboards and to sell clients on your design and persuade them to follow your design direction
  • Innovative mindset with current knowledge of design and creative technology trends related to events
  • Advanced knowledge of Adobe Creative Suite; proficient in prototyping, Microsoft Office and Teams
  • Experience in successfully leading creative project teams and individuals in implementing event vision, concept and design of deliverables in various platforms and media; reviewing work, troubleshooting and providing feedback
  • Experience in successfully mentoring and coaching creatives in developing conceptual skills, including critiquing design projects and providing constructive feedback in a virtual environment
  • Experience working and mentoring in a fast-paced, matrixed agency or in-house team environment to develop a range of creative initiatives
  • Experience working in a virtual environment with flexibility for a hybrid work arrangement - remotely and at local EY office as required by business needs


Ideally, you’ll also have

  • Strong communication skills, active listening and diplomacy when collaborating with internal team members and business partners
  • Advanced problem-solving skills to identify, resolve and overcome challenges


What we offer you
At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.

  • We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $73,300 to $137,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $88,000 to $155,800. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
  • Join us in our team-led and leader-enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
  • Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.



08.12.2025
EY

EY FS RC - BT INS P C India, Maharashtra, Pune

Description:

P&C (Property & Casualty - Personal and Commercial Insurance)

Candidates should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with one or more functional processes – PC, BC, CC (Guidewire/Duckcreek preferred).

Lines of Business (Personal and Commercial): must have

  • Property
  • Auto
  • General Liability

Good to have -

  • Casualty Lines Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc
  • Inland Marine, Cargo
  • Workers Compensation
  • Umbrella, Excess Liability

Roles and Responsibilities:

  • Worked on multiple business transformation, upgrade, and modernization programs.
  • Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
  • Work with the client to define the most optimal future state operational process and related product configuration.
  • Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
  • Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
  • Work closely with the product design and development team to analyse and extract functional enhancements.
  • Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills:

  • Product Knowledge – Guidewire, Duckcreek, Exigent, Majesco. (Preferred Guidewire/Duckcreek)
  • Strong skills in stakeholder management and communication.
  • Should have experience with end-to-end processes in the P&C insurance domain.
  • Data transformation, Business data modelling, Data mapping, transformation rules, data migration, data profiling, data visualisation, Tableau, Power-BI, Reporting, Basic understanding of SQL query
  • AI and Gen AI use cases implementation, Understanding of AI architecture for AI Use cases
  • Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours).
  • Good organizational and time management skills required.
  • Should have good written and verbal communication skills in English.
  • Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance), and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage.
  • Additional experience in Life or other insurance domains is an added advantage.

08.12.2025
EY

EY EY - GDS Consulting AI DATA GCP Architect Manager E India, Maharashtra, Pune

Description:

Critical roles and responsibilities for the GCP Architect

Looking for a highly skilled GCP Architect to lead a migration project, ensuring a seamless transition of workloads, applications, and services to Google Cloud Platform (GCP) while integrating with other cloud environments like AWS and Azure.

Key Responsibilities:

  • Cloud Strategy & Architecture: Design and implement scalable, secure, and cost-effective GCP architectures for a multi-cloud environment.
  • Migration Planning: Develop and execute migration strategies for applications, data, and infrastructure from on-premise or other cloud platforms (AWS/Azure) to GCP.
  • Infrastructure as Code (IaC): Utilize Terraform, Ansible, or other IaC tools for automated provisioning and management of cloud resources.
  • Security & Compliance: Ensure cloud environments adhere to industry security best practices, compliance standards (e.g., ISO, SOC, HIPAA), and Google Cloud security frameworks.
  • CI/CD & DevOps Integration: Work with DevOps teams to integrate CI/CD pipelines using Azure DevOps, GitHub Actions, or Jenkins for cloud deployments.
  • Networking & Hybrid Cloud: Design and implement hybrid and multi-cloud networking solutions, including VPNs, interconnects, and service mesh (Anthos, Istio).
  • Performance & Cost Optimization: Monitor, optimize, and provide recommendations for cloud resource utilization, cost efficiency, and performance enhancements.
  • Stakeholder Collaboration: Work closely with business, security, and engineering teams to align cloud solutions with organizational goals.
  • Incident Management & Troubleshooting: Provide technical leadership for incident resolution, root cause analysis, and continuous improvement in cloud operations.

Technical Expertise:

  • Strong hands-on experience with GCP services (Compute Engine, GKE, Cloud Functions, BigQuery, IAM, Cloud Armor, etc.).
  • Familiarity with AWS and/or Azure services and cross-cloud integrations.
  • Proficiency in Terraform, Ansible, or other IaC tools.
  • Experience with containerization (Docker, Kubernetes) and microservices architecture.
  • Strong networking skills, including VPC design, Cloud Interconnect, and hybrid cloud solutions.
  • Understanding of security best practices, encryption, and identity management in a multi-cloud setup.
  • Experience in Migration from on-prem to GCP or hybrid cloud architectures.
  • Experience with Anthos, Istio, or service mesh technologies.
  • Strong scripting skills in Python, Bash, or Go for automation.
  • Should have 9–11 years of experience.
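As one small example of the kind of Python automation scripting listed above, here is a hedged sketch of a resource label-policy audit; the required labels and resource records are hypothetical, and a real check would page through the cloud provider's resource API rather than an in-memory list:

```python
REQUIRED_LABELS = {"env", "owner", "cost-center"}  # illustrative policy

def missing_labels(resources):
    """Return {resource name: missing label set} for every resource
    that violates the labeling policy."""
    violations = {}
    for res in resources:
        missing = REQUIRED_LABELS - set(res.get("labels", {}))
        if missing:
            violations[res["name"]] = missing
    return violations

# Hypothetical inventory records, not real API output.
resources = [
    {"name": "vm-1", "labels": {"env": "prod", "owner": "data", "cost-center": "42"}},
    {"name": "vm-2", "labels": {"env": "dev"}},
]
print(missing_labels(resources))  # vm-2 lacks 'owner' and 'cost-center'
```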

Certifications (Preferred):

  • Google Professional Cloud Architect
  • Google Professional Cloud Security Engineer
  • AWS/Azure Architect certifications (nice to have)

Soft Skills:

  • Excellent problem-solving and analytical skills.
  • Strong communication and stakeholder management abilities.
  • Ability to lead and mentor technical teams in a fast-paced environment.

08.12.2025
EY

EY EY-GDS Consulting-AI DATA-MS Fabric- Manager India, Maharashtra, Pune

Description:

As part of our EY-DnA team, you will be responsible for designing, developing, and maintaining distributed systems using Microsoft Fabric, including One Lake, Azure Data Factory (ADF), Azure Synapse, Notebooks, Data Warehouse, and Lakehouse. You will play a crucial role in architecting and implementing enterprise data platforms and data management practices, ensuring the delivery of high-quality solutions that meet business requirements. You will collaborate with system architects, business analysts, and stakeholders to understand their requirements and convert them into technical designs. Your role will involve designing, building, testing, deploying, and maintaining robust integration architectures, services, and workflows.

To qualify for the role, you should:

  • Design, develop, and implement ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into target systems.
  • Architect and implement Azure Synapse, Data Warehouse, and Lakehouse solutions, ensuring scalability, performance, and reliability.
  • Utilize Notebooks and Spark for data analysis, processing, and visualization to derive actionable insights from large datasets.
  • Define and implement enterprise data platform architecture, including the creation of gold, silver, and bronze datasets for downstream use.
  • Hands-on development experience in cloud-based big data technologies, including Azure, Power Platform, Microsoft Fabric/Power BI, leveraging languages such as SQL, PySpark, DAX, Python, and Power Query.
  • Designing and developing BI reports and dashboards by understanding the business requirements, designing the data model, and developing visualizations that provide actionable insights.
  • Collaborate effectively with key stakeholders and other developers to understand business requirements, provide technical expertise, and deliver solutions that meet project objectives.
  • Mentor other developers in the team, sharing knowledge, best practices, and emerging technologies to foster continuous learning and growth.
  • Stay updated on industry trends and advancements in Microsoft Fabric and related technologies, incorporating new tools and techniques to enhance development processes and outcomes.
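A minimal sketch of the gold/silver/bronze (medallion) layering described above, using plain Python structures in place of Fabric Lakehouse tables (all field names and values are illustrative):

```python
# Bronze: raw ingested records, duplicates and bad rows included.
bronze = [
    {"id": 1, "amount": "100", "country": "IN"},
    {"id": 1, "amount": "100", "country": "IN"},   # duplicate
    {"id": 2, "amount": None, "country": "US"},    # bad record
    {"id": 3, "amount": "250", "country": "IN"},
]

# Silver: deduplicate, drop bad rows, cast types.
seen, silver = set(), []
for row in bronze:
    if row["id"] in seen or row["amount"] is None:
        continue
    seen.add(row["id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: business-level aggregate for downstream reports.
gold = {}
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]

print(gold)  # {'IN': 350.0}
```

In a Fabric/Synapse pipeline each layer would be a Lakehouse table and the transforms would be Notebook or Data Factory activities, but the layering contract is the same.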

Skills and attributes for success:

  • 8-11 years of experience in developing data solutions using the Microsoft Azure cloud platform.
  • Strong experience with Azure Data Factory and ETL Pipelines
  • Strong experience with Azure Synapse, Data Warehouse and Lakehouse implementations
  • Strong experience with Notebooks and Spark
  • Background in architecting and implementing enterprise data platforms and data management practices, including gold, silver, and bronze datasets for downstream use.
  • Hands-on experience in cloud-based big data technologies including Azure, Power Platform, and Microsoft Fabric/Power BI, using languages such as SQL, PySpark, DAX, Python, and Power Query.
  • Creating Business Intelligence (BI) reports and crafting complex Data Analysis Expressions (DAX) for metrics.

Ideally, you’ll also have:

  • Exceptional communication skills and the ability to articulate ideas clearly and concisely.
  • Capability to work independently as well as lead a team effectively.




08.12.2025
EY

EY FS-RC-BT INS - P C Non US India, Maharashtra, Pune

Description:

P&C (Property & Casualty - Personal and Commercial Insurance)

Candidates should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with one or more functional processes – PC, BC, CC (Guidewire/Duckcreek preferred).

LOBs (Lines of Business – Personal and Commercial Lines): must have

  • Property
  • Auto
  • General Liability

Good to have -

  • Casualty Lines Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc
  • Inland Marine, Cargo
  • Workers Compensation
  • Umbrella, Excess Liability

Roles and Responsibilities:

  • Experience in creating business process maps for future state architecture, creating the WBS for the overall conversion strategy, and running the requirement refinement process in multi-vendor engagements.
  • Worked on multiple business transformation, upgrade, and modernization programs.
  • Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current state maturity, gaps in functionality, and COTS solution features.
  • Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
  • Work with the client to define the most optimal future state operational process and related product configuration.
  • Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
  • Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
  • Work closely with the product design and development team to analyse and extract functional enhancements.
  • Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills:

  • Product Knowledge – Guidewire, Duckcreek, Exigent, Genius, Sapiens, One-Shield, Acquarium, Majesco. (Preferred Guidewire/Duckcreek)
  • Strong skills in stakeholder management, communication, and conflict resolution when working with multi-cultural/global stakeholders.
  • Should have handled international client transitions and end-to-end processes in the P&C insurance domain.
  • Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours).
  • Good organizational and time management skills required.
  • Should have good written and verbal communication skills in English.
  • Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance), and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage.
  • Additional experience in Life or other insurance domains is an added advantage.

08.12.2025
EY

EY EY - GDS Consulting AI DATA Data Engineering Manager E India, Maharashtra, Pune

Description:

Objectives and Purpose

  • The Data Engineering Manager leads large-scale solution architecture design and optimisation to provide streamlined insights to partners throughout the business. This individual leads a team of mid-level and senior data engineers and partners with the visualization team on data quality and troubleshooting needs.
  • The Data Engineering Manager will:
    • Implement data processes for the data warehouse and internal systems
    • Lead a team of junior and senior data engineers in executing data processes and providing quality, timely data management
    • Manage data architecture and design ETL processes
    • Clean, aggregate, and organize data from disparate sources and transfer it to data warehouses
    • Lead development, testing, and maintenance of data pipelines and platforms, enabling quality data to be used in business dashboards and tools
    • Support team members and direct reports in refining and validating data sets
    • Ensure adherence to enterprise data management principles and best practices
    • Create, maintain, and support the data platform and infrastructure that enables the analytics front end; this includes the testing, maintenance, construction, and development of architectures such as high-volume, large-scale data processing and databases with proper verification and validation processes
    • Apply enterprise data governance principles across all incoming projects: data catalogue, data security, access policy control, data lineage, and new data domains
    • Understand the relevant data domains: pharma commercial, medical affairs, sales and marketing, and patient services

Your key responsibilities

  • Lead the design, development, optimization, and maintenance of data architecture and pipelines adhering to ETL principles and business goals.
  • Develop and maintain scalable data pipelines, building out new integrations using AWS-native technologies and Databricks to support increases in data source count, volume, and complexity.
  • Define data requirements, gather and mine large volumes of structured and unstructured data, and validate data by running various data tools in the big data environment.
  • Lead the ad hoc data analysis, support standardization, customization and develop the mechanisms to ingest, analyze, validate, normalize, and clean data.
  • Write unit/integration/performance test scripts and perform data analysis required to troubleshoot data related issues.
  • Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes.
  • Lead the evaluation, implementation and deployment of emerging tools and processes for analytic data engineering to improve productivity.
  • Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
  • Solve complex data problems to deliver insights that help achieve business objectives.
  • Partner with Business Analysts and Enterprise Architects to develop technical architectures for strategic enterprise projects and initiatives.
  • Coordinate with Data Scientists, visualization developers and other data consumers to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling.
  • Collaborate with AI/ML engineers to create data products for analytics and data scientist team members to improve productivity.
  • Advise, consult, mentor and coach other data and analytic professionals on data standards and practices, promoting the values of learning and growth.
  • Foster a culture of sharing, re-use, design for scale stability, and operational efficiency of data and analytical solutions.
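One lightweight way to implement the data reconciliation and quality monitoring described above is to compare row counts and per-key content hashes between a source extract and the loaded target. This sketch uses illustrative row shapes, not any specific pipeline's schema:

```python
import hashlib

def reconcile(source_rows, target_rows, key="id"):
    """Compare row counts and per-key content hashes between a source
    extract and the loaded target, reporting rows that are missing or
    have drifted."""
    def digest(rows):
        # Stable hash of each row's full content, keyed by its id.
        return {r[key]: hashlib.sha256(
                    repr(sorted(r.items())).encode()).hexdigest()
                for r in rows}
    src, tgt = digest(source_rows), digest(target_rows)
    return {
        "count_match": len(src) == len(tgt),
        "missing_in_target": sorted(set(src) - set(tgt)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 99}]  # drifted value
print(reconcile(source, target))
# {'count_match': True, 'missing_in_target': [], 'mismatched': [2]}
```

At warehouse scale the same checks are typically pushed down as SQL aggregates (counts, sums, checksums) rather than hashing rows in Python, but the contract is identical.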

Preferred Skillsets

  • Bachelor's degree in Engineering, Computer Science, Data Science, or related field
  • 10+ years of experience in software development, data engineering, ETL, and analytics reporting development.
  • Expert in building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines.
  • Advanced experience utilizing modern data architecture and frameworks like data mesh, data fabric, data product design
  • Experience with designing data integration frameworks capable of supporting multiple data sources, consisting of both structured and unstructured data
  • Proven track record of designing and implementing complex data solutions
  • Demonstrated understanding and experience using:
    • Data Engineering Programming Languages (i.e., Python, SQL)
    • Distributed Data Framework (e.g., Spark)
    • Cloud platform services (AWS/Azure)
    • Relational Databases
    • Working knowledge of DevOps (Github/Gitlab etc.) with continuous integration
    • AWS knowledge of services such as Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services
    • Data ingestion patterns and the ability to recommend the right tools for data ingestion, collaborating with the enterprise architect
    • Knowledge of Data lakes, Data warehouses
    • Databricks/Delta Lakehouse architecture
  • Deep understanding of database architecture, data modelling concepts, and administration.
  • Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines.
  • Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architectures/pipelines that fit business goals.
  • Experience extracting, transforming, and loading data from multiple external/internal sources into a single, consistent source, using Databricks Lakehouse/Data Lake concepts, to serve business users and data visualization needs.
  • Experience applying continuous integration and delivery principles to automate code deployment to elevated environments using GitHub Actions.
  • Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners.
  • Strong organizational skills with the ability to manage multiple projects simultaneously, operating as a leading member of globally distributed teams.
  • Strong problem solving and troubleshooting skills.
  • Lead and oversee the code review process within the data engineering team to ensure high-quality, efficient, and maintainable code, while optimizing for performance and scalability.
  • Ability to work in a fast-paced environment and adapt to changing business priorities.
  • Ability to identify and implement strategies to optimize AWS/Databricks cloud costs, ensuring efficient and cost-effective use of cloud resources.
  • Hands-on experience with AWS/Azure cloud and Spark; should be open to upskilling in Snowflake (must), Redshift, Postgres, and Spark architecture.
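As a rough illustration of the dimensional data modelling and ETL skills listed above, the sketch below loads a tiny star schema in plain Python with sqlite3. The table and column names (`dim_product`, `fact_sales`) and the sample records are hypothetical, chosen only to show the pattern:

```python
import sqlite3

# Invented raw records standing in for an external source
raw_sales = [
    {"order_id": 1, "product": "widget", "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "product": "gadget", "region": "APAC", "amount": 75.5},
    {"order_id": 3, "product": "widget", "region": "APAC", "amount": 60.0},
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("CREATE TABLE fact_sales (order_id INTEGER, product_id INTEGER, region TEXT, amount REAL)")

def product_key(name):
    """Look up (or create) the surrogate key for a product dimension row."""
    cur.execute("INSERT OR IGNORE INTO dim_product (name) VALUES (?)", (name,))
    cur.execute("SELECT product_id FROM dim_product WHERE name = ?", (name,))
    return cur.fetchone()[0]

# Transform + load: resolve dimension keys, then insert fact rows
for row in raw_sales:
    cur.execute(
        "INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
        (row["order_id"], product_key(row["product"]), row["region"], row["amount"]),
    )
conn.commit()

# Query the star schema: revenue per product
cur.execute(
    "SELECT d.name, SUM(f.amount) FROM fact_sales f "
    "JOIN dim_product d USING (product_id) GROUP BY d.name ORDER BY d.name"
)
print(cur.fetchall())
```

In a production pipeline the same shape would run in Spark or Databricks against cloud storage rather than an in-memory SQLite database, but the key steps (surrogate-key resolution, fact loading, star-schema queries) are the same.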

Good to have skillsets

  • Master's degree in engineering specialized in Computer Science, Data Science, or related field
  • Experience working with pharma clients, preferably in commercial pharma, marketing and medical affairs
  • Understanding of Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms is highly advantageous.
  • Demonstrated understanding and experience using:
    • Knowledge of AWS CDK
    • Job orchestration tools such as Tidal, Airflow, or similar
  • Knowledge of NoSQL databases
  • Experience in a global working environment
  • Databricks Certified Data Engineer Professional
  • AWS Certified Data Engineer - Associate



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.


These jobs might be a good fit

Description:

Your key responsibilities

  • Proven experience driving Analytics GTM/Pre-Sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
  • Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. [3–5 years]
  • Understand the current and future-state enterprise architecture.
  • Contribute to various technical streams during project implementation.
  • Provide product- and design-level technical best practices
  • Interact with senior client technology leaders, understand their business goals, create, architect, propose, develop and deliver technology solutions
  • Define and develop client-specific best practices around data management within a Hadoop or cloud environment
  • Recommend design alternatives for data ingestion, processing and provisioning layers
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies

AWS

  • Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
  • Experience in PySpark/Spark/Scala
  • Experience using software version control tools (Git, Jenkins, Apache Subversion)
  • AWS certifications or other related professional technical certifications
  • Experience with cloud or on-premises middleware and other enterprise integration technologies.
  • Experience in writing MapReduce and/or Spark jobs.
  • Demonstrated strength in architecting data warehouse solutions and integrating technical components.
  • Good analytical skills with excellent knowledge of SQL.
  • 3+ years of work experience with very large data warehousing environments
  • Excellent communication skills, both written and verbal
  • 3+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
  • 3+ years of experience with data modelling concepts
  • 3+ years of Python and/or Java development experience
  • 3+ years’ experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive)
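The MapReduce and Spark experience called for above comes down to the map/shuffle/reduce pattern, which can be sketched in plain Python with no cluster; the input lines are invented for illustration:

```python
from itertools import groupby
from operator import itemgetter

# Toy input standing in for files on HDFS/S3
lines = ["big data on aws", "spark and hive on emr", "big data pipelines"]

# Map: emit (word, 1) pairs
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle: group pairs by key (a cluster would partition these across nodes)
mapped.sort(key=itemgetter(0))

# Reduce: sum the counts for each word
counts = {key: sum(v for _, v in group) for key, group in groupby(mapped, key=itemgetter(0))}
print(counts)
```

A real EMR or Spark job distributes the map and reduce phases across executors, but the logical dataflow is exactly this.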

Skills and attributes for success

  • Experience architecting highly scalable solutions on AWS.
  • Strong understanding of and familiarity with AWS/GCP/Big Data ecosystem components
  • Strong understanding of underlying AWS/GCP architectural concepts and distributed computing paradigms
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
  • Hands on experience with major components like cloud ETLs, Spark, Databricks
  • Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB
  • Knowledge of Spark and Kafka integration with multiple Spark jobs to consume messages from multiple Kafka partitions
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop Architectural concepts and distributed computing paradigms
  • Good knowledge of Apache Kafka and Apache Flume
  • Experience in enterprise-grade solution implementations.
  • Experience in performance benchmarking enterprise applications
  • Experience in data security [in transit, at rest]
  • Strong UNIX operating system concepts and shell scripting knowledge
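The Kafka/Spark integration item above concerns consuming messages across multiple topic partitions and aggregating state over them. A toy, Kafka-free sketch of that pattern follows; the partition layout, user keys, and event names are hypothetical:

```python
from collections import defaultdict

# Invented messages, keyed by partition id, standing in for a Kafka topic;
# each Spark task would read one partition range in parallel.
partitions = {
    0: [("user_a", "click"), ("user_b", "click")],
    1: [("user_a", "view"), ("user_c", "click")],
    2: [("user_b", "view"), ("user_a", "click")],
}

def consume(partitions):
    """Yield (key, value) messages partition by partition, mimicking parallel tasks."""
    for pid in sorted(partitions):
        for msg in partitions[pid]:
            yield msg

# Stateful aggregation across all partitions: events per user
events_per_user = defaultdict(int)
for user, _event in consume(partitions):
    events_per_user[user] += 1

print(dict(events_per_user))
```

With real infrastructure, the consume step would be a Kafka consumer group or a Spark Structured Streaming source, and ordering is guaranteed only within a partition, which is why aggregations are keyed rather than order-dependent.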

To qualify for the role, you must have

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communicator (written and verbal, formal and informal).
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Strong verbal and written communication skills.
  • Must be a team player and enjoy working in a cooperative and collaborative team environment.
  • Adaptable to new technologies and standards.
  • Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support
  • Responsible for evaluating technical risks and mapping out mitigation strategies
  • Working knowledge of at least one cloud platform: AWS, Azure, or GCP
  • Excellent business communication, consulting, and quality-process skills
  • Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
  • Minimum 3–5 years of industry experience

Ideally, you’ll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills

What we look for

  • People with technical experience and enthusiasm to learn new things in this fast-moving environment

You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:

  • Support, coaching and feedback from some of the most engaging colleagues around
  • Opportunities to develop new skills and progress your career
  • The freedom and flexibility to handle your role in a way that’s right for you




Find your dream job in the high tech industry with Expoint. With our platform you can easily search for Fs Rc-ey Comply Rvs - Axiom opportunities at Ey in India, Pune. Whether you're seeking a new challenge or looking to work with a specific organization in a specific role, Expoint makes it easy to find your perfect job match. Connect with top companies in your desired area and advance your career in the high tech field. Sign up today and take the next step in your career journey with Expoint.