
EY - GDS Consulting AI DATA AWS DBX Senior
India, Karnataka, Bengaluru 
209644151

About the role:

This position reports to and receives strategic direction from the Tech Delivery Lead.

How you will contribute:

  • Develop and maintain scalable data pipelines in line with ETL principles, and build out new integrations using AWS-native technologies to support continuing growth in data sources, volume, and complexity.
  • Define data requirements, gather and mine data, and validate the efficiency of data tools in the big-data environment.
  • Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity.
  • Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes.
  • Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
  • Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling.
  • Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth.
  • Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
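Purely as an illustration of the ETL pipeline work described above (not part of the role description), the extract-transform-load pattern can be sketched in plain Python; the order data, field names, and table are hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical sketch of the ETL pattern: extract raw records,
# validate/transform them, then load them into a relational target.

RAW_CSV = """order_id,amount,region
1,120.50,EMEA
2,80.00,APAC
3,not_a_number,EMEA
"""

def extract(raw: str):
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records):
    """Transform: drop invalid rows and cast amounts to float."""
    clean = []
    for rec in records:
        try:
            clean.append({"order_id": int(rec["order_id"]),
                          "amount": float(rec["amount"]),
                          "region": rec["region"]})
        except ValueError:
            continue  # skip rows that fail validation
    return clean

def load(records, conn):
    """Load: write clean records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount, :region)",
                     records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # the invalid row is dropped: (2, 200.5)
```

In a production AWS pipeline the same three stages would typically map onto services named later in this posting (e.g., S3 for extract, PySpark/Databricks for transform, RDS for load).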

Minimum Requirements/Qualifications:

  • Bachelor’s degree in Engineering, Computer Science, Data Science, or related field
  • 5+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development
  • Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling, and developing and optimizing ETL pipelines
  • Proven track record of designing and implementing complex data solutions
  • Demonstrated understanding and experience using:
    • Data Engineering Programming Languages (e.g., Python)
    • Distributed Data Technologies (e.g., PySpark)
    • Cloud platform deployment and tools (e.g., Kubernetes)
    • Relational SQL databases
    • DevOps and continuous integration
    • AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS)
    • Databricks/ETL
    • IICS/DMS
    • GitHub
    • EventBridge, Tidal
  • Understanding of database architecture and administration
  • Applies the principles of continuous integration and delivery to automate the deployment of code changes to higher environments, improving code quality, test coverage, and the automation of resilient test cases
  • Possesses high proficiency in programming languages and tools (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architectures/pipelines that fit business goals
  • Strong organizational skills with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions
  • Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners
  • Strong problem solving and troubleshooting skills
  • Ability to work in a fast-paced environment and adapt to changing business priorities
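As a small illustration of the AWS Lambda and S3 experience listed above, an event-driven handler might look like the sketch below. The event shape follows the S3 notification structure; the bucket and key names are made up, and a real deployment would add error handling and downstream processing:

```python
# Hypothetical sketch of an AWS Lambda handler reacting to an S3 event.
# Bucket and key names below are illustrative only.

def handler(event, context=None):
    """Return the (bucket, key) pairs referenced in an S3 notification event."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        objects.append((s3["bucket"]["name"], s3["object"]["key"]))
    return objects

# Local invocation with a minimal fake event:
fake_event = {"Records": [{"s3": {"bucket": {"name": "raw-data"},
                                  "object": {"key": "orders/2024/01.csv"}}}]}
print(handler(fake_event))  # [('raw-data', 'orders/2024/01.csv')]
```

In practice such a handler would hand the object reference to the rest of the pipeline (e.g., a Step Functions workflow or a Databricks job) rather than just returning it.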



EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.