

Your key responsibilities
Design, develop and maintain scalable ETL (Extract, Transform, Load) processes to efficiently extract data from various structured and unstructured sources, ensuring accuracy, consistency and performance optimization (a minimal illustrative sketch follows this list).
Architect and manage database systems to support large-scale data storage and retrieval, ensuring high availability, security and efficiency in handling complex datasets.
Integrate and transform data from multiple sources including APIs, on-premises databases and cloud storage, creating unified datasets to support data-driven decision-making across the organization.
Collaborate with business intelligence analysts, data scientists and other stakeholders to understand specific data needs, ensuring the delivery of high-quality, business-relevant datasets.
Monitor, troubleshoot and optimize data pipelines and workflows to resolve performance bottlenecks, improve processing efficiency and ensure data integrity.
Develop automation frameworks for data ingestion, transformation and reporting to streamline data operations and reduce manual effort.
Work with cloud-based data platforms and technologies such as AWS (Redshift, Glue, S3), Google Cloud (BigQuery, Dataflow), or Azure (Synapse, Data Factory) to build scalable data solutions.
Optimize data storage, indexing, and query performance to support real-time analytics and reporting, ensuring cost-effective and high-performing data solutions.
Lead or contribute to special projects involving data architecture improvements, migration to modern data platforms, or advanced analytics initiatives.
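By way of illustration of the ETL work described in the first responsibility above, here is a minimal sketch in Python using only the standard library; the source file, column names and target table are hypothetical stand-ins for real structured sources and a warehouse, not a description of any specific pipeline.

```python
# Minimal ETL sketch: extract from a CSV, apply a simple transform,
# load into SQLite. File name, columns and table are hypothetical.
import csv
import sqlite3

def extract(path):
    # Read raw rows from a structured source (here, a local CSV file).
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Normalise types and drop incomplete records for consistency.
    cleaned = []
    for row in rows:
        if not row.get("order_id") or not row.get("amount"):
            continue
        cleaned.append((row["order_id"], row["customer_id"], float(row["amount"])))
    return cleaned

def load(records, db_path="warehouse.db"):
    # Write the unified dataset into a target table.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer_id TEXT, amount REAL)"
    )
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```

In a production setting the same extract/transform/load structure would typically run inside an orchestrated tool such as AWS Glue or a scheduled pipeline rather than a single script.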
Skills and attributes for success
A team player with strong analytical, communication and interpersonal skills
A commitment to staying current with new technologies in the market
A winning personality and the ability to become a trusted advisor to stakeholders
To qualify for the role, you must have
Minimum 5 years of relevant work experience, with at least 2 years in designing and maintaining data pipelines, ETL processes and database architectures.
Bachelor’s degree (B.E./B.Tech) in Computer Science or IT, or Bachelor’s in Data Science, Statistics, or related field.
Strong expertise in SQL, Python, or Scala for data processing, automation and transformation.
Hands-on experience with big data frameworks such as Apache Spark, Hadoop, or Kafka for real-time and batch processing (see the sketch after this list).
Experience in cloud data platforms including AWS Redshift, Google BigQuery, or Azure Synapse, with proficiency in cloud-native ETL and data pipeline tools.
Strong understanding of data modeling principles, relational and NoSQL databases (PostgreSQL, MySQL, MongoDB, Cassandra).
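As a hedged example of the batch-processing experience mentioned above, the following PySpark sketch reads a hypothetical Parquet dataset, aggregates it, and writes the result back out; the paths, column names and schema are assumptions made purely for illustration.

```python
# Minimal batch-processing sketch with PySpark; the input path and
# schema are hypothetical and stand in for a real data source.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch_example").getOrCreate()

# Extract: read a (hypothetical) partitioned Parquet dataset.
events = spark.read.parquet("s3://example-bucket/events/")

# Transform: aggregate daily event counts per user.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Load: write results back out for downstream analytics.
daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")

spark.stop()
```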
Ideally, you’ll also have
Strong verbal and written communication, facilitation, relationship-building, presentation and negotiation skills.
High flexibility, adaptability and creativity.
Comfort interacting with senior executives (within the firm and at the client).
What we look for
Strong teamwork, work ethic, product mindset, client centricity and a relentless commitment to EY values.
We offer a competitive remuneration package where you’ll be rewarded for your individual and team performance. Our comprehensive Total Rewards package includes support for flexible working and career development, and with FlexEY you can select benefits that suit your needs, covering holidays, health and well-being, insurance, savings and a wide range of discounts, offers and promotions. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you
If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.
Make your mark.
Apply now.

General Information
Available for VISA Sponsorship: Yes
Contract Type: Full Time - Permanent
We are looking for people with strong client-facing experience in data solutioning and consulting, and a passion for innovation, technology, data, analytics and AI. You will provide consultative services on a variety of data-driven decision-making projects within a fast-paced consulting environment.
As a Data Architect, you will support our clients in defining and implementing a data journey aligned with their strategic objectives. The responsibilities of the successful candidate will include:
Your Key Responsibilities:
Key Practice and People Development Responsibilities:
To qualify for the role, you must have
Ideally, you’ll also have
What we look for
We offer a competitive remuneration package. Our comprehensive Total Rewards package includes support for flexible working and career development, and with FlexEY you can select benefits that suit your needs, covering holidays, health and well-being, insurance, savings and a wide range of discounts, offers and promotions. Plus, we offer:
All our employees are given a benefits package which they can tailor to suit their individual preferences. Our range of benefits include:
Career Progression:
Inclusion & Diversity
We recognise the strength that comes from having a diverse workforce and building a culture where we support all our people to achieve their potential. You will be embraced for who you are and empowered to use your voice to help others find theirs.


In the role of Senior Principal AI Engineer, you will lead our AI/GenAI initiatives. You will be responsible for architecting AI/GenAI solutions, executing our GenAI/AI strategy, and driving measurable business impact. You will transform our Customer Support business towards a fully autonomous agentic AI system. This role requires an individual capable of translating strategic goals into tangible, high-impact AI solutions, fostering a culture of connection, innovation, and excellence within a dynamic, collaborative environment.
You will
Own the full AI/ML lifecycle from prototype to deployment, ensuring the solutions are robust, scalable, and aligned with AI-product needs
Design, develop, and implement AI agents capable of autonomous decision-making and action
Stay current with the latest AI advancements, proactively apply new technologies to enhance team projects, and remain committed to driving innovation
Be a thought leader by inspiring and driving innovation, promoting and supporting best practices, and mentoring the development of IP, research publications and white papers
Essential Requirements:
15+ years of experience in applied science or complex software systems, especially involving deep learning, machine learning and LLM solutions that have been successfully delivered
Advanced experience with Python, ETL pipelines (Airflow preferred) and data warehousing concepts
Strong software development skills, particularly in Python, with experience working with AI frameworks and tools in cloud environments; knowledge of ML, NLP, Information Retrieval, Recommender Systems and LLMs
Experience with container and orchestration technologies (Docker, Kubernetes, etc.) and cloud platforms (AWS, GCP or Azure); experience with training, fine-tuning, and applying large language models (LLMs) for agentic AI applications
Proficiency in designing and developing multi-agent systems where multiple AI agents collaborate to achieve complex tasks (a minimal sketch follows this list)
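Purely as an illustrative sketch of the agentic pattern referenced in the last requirement, the Python below outlines a minimal tool-using agent loop; `llm_complete` is a hypothetical stand-in for a real LLM call, and the single tool and stopping rule are invented for illustration, not a description of any production system.

```python
# Minimal agentic-loop sketch: an agent repeatedly asks a model for the next
# action, executes a tool, and feeds the observation back until it decides
# to stop. `llm_complete` is a hypothetical placeholder for a real LLM call.

def llm_complete(history):
    # Placeholder policy: a real system would send the history to an LLM and
    # parse the proposed next action from its response.
    if any("result:" in h for h in history):
        return {"action": "finish", "answer": "done"}
    return {"action": "lookup_order", "argument": "12345"}

TOOLS = {
    # Hypothetical tool: in practice this would call a backend service or API.
    "lookup_order": lambda order_id: f"order {order_id} found in the system",
}

def run_agent(goal, max_steps=5):
    history = [f"goal: {goal}"]
    for _ in range(max_steps):
        step = llm_complete(history)
        if step["action"] == "finish":
            return step["answer"]
        observation = TOOLS[step["action"]](step["argument"])
        history.append(f"result: {observation}")
    return "stopped without finishing"

print(run_agent("Where is order 12345?"))
```

A multi-agent variant would run several such loops with distinct roles and let them exchange messages; that coordination layer is deliberately omitted here.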
Desirable Requirements
PhD or Master's degree in Technology, Computer Science, Machine Learning or equivalent quantitative field
Familiarity with graph-based techniques, semantic search and hybrid search systems, and with implementing solutions that combine traditional IR methods with machine learning models to enhance search relevance, accuracy and efficiency. Familiarity with large-scale data handling when dealing with telemetry systems.
Publication of research papers, patents, or contributions to open-source projects. Recognition as a thought leader in AI, ML, or NLP fields is highly valued.

Requisition Id: 1648259
The opportunity: Executive-National-Assurance-ASU - FAAS - Financial&AccountingAdv - Mumbai
ASU - FAAS - Financial&AccountingAdv:
1) Ensuring their accounts comply with the requisite audit standards
2) Providing a robust and clear perspective to audit committees and
3) Providing critical information for stakeholders.
Our Service Offerings include External Audit, Financial Accounting Advisory Services (FAAS), IFRS & US GAAP conversion, IPO and other public offerings, Corporate Treasury - IFRS 9 accounting & implementation support, etc.
Your key responsibilities
Technical Excellence
Skills and attributes
To qualify for the role, you must have
Qualification
What we look for
People with the ability to work in a collaborative manner to provide services across multiple client departments while following the commercial and legal requirements. You will need a practical approach to solving issues and complex problems with the ability to deliver insightful and practical solutions. We look for people who are agile, curious, mindful and able to sustain positive energy, while being adaptable and creative in their approach.
What we offer
If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.

Collaborate with multi-functional engineering teams to deliver the platform and contribute to open-source projects.
Work on building a highly resilient distributed system at eBay's edge that can proxy all eBay traffic, as well as eBay's service mesh for all of our services, with unparalleled observability (a minimal proxy sketch follows this list).
Work with other developers in different phases of system development.
Build and debug large scale distributed systems and help with observability and monitoring of the system.
Contribute your knowledge and experience of the networking topology (TCP and HTTP stack) to solve any network related issues encountered by the system, deploying and troubleshooting large scale infrastructure.
Hands-on development experience with strong analytical skills.
Familiarity with and a proven understanding of networking technologies such as BGP and anycast is a must, along with experience in TCP/IP networking and familiarity with the TCP, UDP, SSL and HTTP protocols.
Hands-on experience with programming languages such as C/C++ and Go (GoLang) to build the system and the automation around it for better observability and remediation; any experience with kernel programming and with Python is a plus!
Experience with distributed systems, including the design and implementation of network communication using container networking technologies.
Multi-threaded programming experience, for example building telemetry infrastructure to accumulate and export statistical information.
Excellent debugging and triaging skills and strong expertise in agile development.
Familiarity with L4 Networking layer and Kubernetes is a plus.
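To make the edge-proxy responsibility above a little more concrete, here is a minimal, hedged sketch of a TCP relay in Python with asyncio; the listen address and backend are hypothetical, and a production proxy would add connection pooling, timeouts, TLS termination and observability.

```python
# Minimal TCP proxy sketch using asyncio: accepts client connections and
# relays bytes to a single (hypothetical) backend, in both directions.
import asyncio

BACKEND_HOST, BACKEND_PORT = "127.0.0.1", 8080  # hypothetical upstream service
LISTEN_HOST, LISTEN_PORT = "0.0.0.0", 9000

async def pipe(reader, writer):
    # Copy bytes from one side of the connection to the other until EOF.
    try:
        while data := await reader.read(65536):
            writer.write(data)
            await writer.drain()
    finally:
        writer.close()

async def handle_client(client_reader, client_writer):
    # Open a connection to the backend and relay traffic in both directions.
    backend_reader, backend_writer = await asyncio.open_connection(BACKEND_HOST, BACKEND_PORT)
    await asyncio.gather(
        pipe(client_reader, backend_writer),
        pipe(backend_reader, client_writer),
    )

async def main():
    server = await asyncio.start_server(handle_client, LISTEN_HOST, LISTEN_PORT)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```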

What you will accomplish
Lead the full lifecycle of software projects — from requirements gathering through design, development, testing, and deployment.
Build and maintain tools that automate network lifecycle management and infrastructure deployment.
Develop and configure monitoring systems to support network engineering operations (a minimal illustrative sketch follows this list).
Implement new features, fix issues, and maintain high standards of software quality and test coverage.
Model, manage, and analyze network data to improve automation and operational efficiency.
Collaborate with engineers across infrastructure, operations, and AI teams to develop innovative solutions.
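As a small, hedged illustration of the monitoring item above, the Python below probes a hypothetical list of network devices for TCP reachability and emits timestamped samples of the sort a time-series database would ingest; the hostnames, ports and field names are assumptions for illustration only.

```python
# Minimal network-monitoring sketch: probe a (hypothetical) list of devices
# for TCP reachability and record timestamped results, the kind of data a
# time-series database would store.
import socket
import time

DEVICES = [("core-sw-01.example.net", 22), ("edge-rtr-01.example.net", 22)]  # hypothetical

def probe(host, port, timeout=2.0):
    # Return True if a TCP connection to host:port succeeds within the timeout.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def collect_once():
    # One collection cycle: probe every device and emit a timestamped sample.
    samples = []
    for host, port in DEVICES:
        samples.append({"ts": time.time(), "device": host, "reachable": probe(host, port)})
    return samples

if __name__ == "__main__":
    for sample in collect_once():
        print(sample)
```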
What you will bring
5+ years of software development experience, with at least 3 years in project or functional leadership.
Proficiency in Go, Java, and Python, with strong debugging and problem-solving skills.
Experience with CI/CD pipelines, Docker, Kubernetes, and time-series databases (TSDB).
Proficient in SDLC protocols, data modeling, and systems integration.
Working knowledge of UNIX/Linux environments and familiarity with network protocols (TCP/IP, routing, switching).
Bachelor’s or Master’s degree in Computer Science or related field, or equivalent experience in the field.
Preferred qualifications
Familiarity with LLM-based tools (e.g., ChatGPT, Claude Code) for engineering productivity.
Experience with Cisco Nexus Dashboard Fabric Controller or Adobe Flex.
Strong understanding of network automation, monitoring frameworks, and large-scale distributed systems.
The cool part
Work on modern automation technologies powering one of the world’s largest e-commerce infrastructures.
Collaborate with experienced engineers across networking, AI, and systems development.
Compose eBay’s next-generation network management and monitoring solutions — at global scale.
