As an Azure Databricks Data Engineer, you will be instrumental in building and maintaining an optimized data ecosystem on Azure Databricks. You will be responsible for developing and managing data pipelines, ensuring data quality, and enabling advanced analytics capabilities. Your role will involve close collaboration with data scientists, analysts, and other stakeholders to deliver data-driven solutions that support EY’s strategic goals.
Your Key Responsibilities
As a Data Engineer, you will:
- Design, construct, and maintain scalable data pipelines and ETL/ELT processes across Snowflake and/or Azure Databricks environments.
- Develop data processing workflows using Databricks Notebooks, Spark SQL, and/or Snowflake SQL.
- Optimize data storage and processing performance to support real-time analytics and business intelligence.
- Collaborate with cross-functional teams to gather requirements and translate business needs into technical specifications.
- Build and maintain data models, data marts, and data warehouses to support analytics and reporting.
- Implement data governance, security, and compliance best practices across cloud data platforms.
- Troubleshoot and resolve data processing issues, ensuring high data quality and integrity.
- Provide technical guidance on platform capabilities and mentor junior data engineers.
- Stay current with the latest features and trends in Snowflake, Databricks, and cloud data engineering.
- Develop and maintain documentation related to data pipeline architecture, development processes, and governance.
Skills and Attributes for Success
- Certifications such as Snowflake SnowPro, Azure Data Engineer Associate, or Azure Data Scientist Associate.
- Experience with BI tools like Power BI, Tableau, or Looker.
- Familiarity with machine learning frameworks, data science concepts, and big data tools.
- Experience with version control systems (e.g., Git) and workflow orchestration tools (e.g., Apache Airflow).
- Understanding of DevOps practices, CI/CD pipelines, and data integration techniques.
To Qualify for the Role, You Must Have
- A Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
- Proven experience as a Data Engineer with hands-on expertise in Snowflake, Databricks, and/or Azure.
- Strong programming skills in Python, Scala, or Java.
- Proficiency in SQL, data modeling, and data warehousing concepts.
- Knowledge of cloud platforms such as Azure, AWS, or GCP, and their integration with Snowflake.
- Experience with Apache Spark and its integration with Databricks.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work collaboratively in a fast-paced, team-oriented environment.
What We Offer
We offer a competitive compensation package where you’ll be rewarded based on your performance and recognized for the value you bring to our business. In addition, our Total Rewards package allows you to decide which benefits are right for you and which ones will help you build a solid foundation for your future. Our Total Rewards package includes comprehensive medical, prescription drug and dental coverage, a defined contribution pension plan, a great vacation policy plus firm-paid days that let you enjoy extended long weekends throughout the year, statutory holidays and paid personal days (based on province of residence), and a range of exciting programs and benefits designed to support your physical, financial and social well-being. Plus, we offer:
- Support and coaching from some of the most engaging colleagues in the industry
- Learning opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you
EY exists to build a better working world, helping to create long-term value for clients, people and society, and to build trust in the capital markets.