Skills and attributes for success
- Developing solutions that leverage big data technologies (Adobe Experience Platform, Azure Data Factory) to ingest, process, and analyze large, disparate data sets and exceed business requirements.
- Estimating technical effort, work breakdown structure, risks, and solution approaches at the overall solution level.
- Taking responsibility for delivering the overall technical solution on schedule.
- Coaching and mentoring Data Engineers and Senior Data Engineers assigned to the project.
- Working with one or more Senior Data Engineers and Data Engineers on the project, and with the SCRUM team, to ensure a quality delivery to time, scope, and budget.
- Unifying, enriching, and analyzing customer data to derive insights and opportunities.
- Leveraging in-house data platforms as needed, and recommending and building new data platforms and data applications based on solution patterns as required to exceed business requirements.
- Clearly communicating findings, recommendations, and opportunities to improve data systems and solutions.
- Demonstrating deep understanding of big data technologies, concepts, tools, features, functions, and the benefits of different approaches.
- Seeking out information to learn about emerging methodologies and technologies.
- Clarifying problems by driving to understand the true issue.
- Applying a data-driven approach (KPIs) to tie technology solutions to specific business outcomes.
- Collaborating, influencing, and building consensus through constructive relationships and effective listening.
- Solving problems by incorporating data into decision-making.
Knowledge and Skills Requirements:
- Work experience as a technical lead focused on data modeling and data integration development.
- Demonstrated experience developing data solutions using the Azure Microsoft cloud stack.
- Experience developing solutions using Azure Functions, Azure Blob Storage, and SQL Data Warehouse.
- Data pipeline/ETL development expertise using Azure Data Factory.
- Demonstrated hands-on experience aggregating, querying, organizing, and analyzing large data sets.
- Experience designing and building data warehouses with a focus on customer profile data.
- Strong data modeling and data skills.
- Strong data analytics background, including designing and developing complex SQL queries.
- Expertise in developing webhooks and consuming REST APIs with JavaScript/Node.js (see the sketch after this list).
- Proficiency coding in at least two of the following: JavaScript, Python, Java, or C#.
- Experience with ETL/batch and near-real-time processing.
- Strong communication skills are essential: the ability to listen to and understand the question, and to develop and deliver clear insights.
- Outstanding team player.
- Independent and able to manage and prioritize workload.
- Ability to adapt to change quickly and positively.
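As a purely illustrative aside for the webhook and REST API items above, the sketch below shows one way such an integration point might look. It assumes Node.js 18+ (for the built-in fetch) and Express; the route, the api.example.com endpoint, and the API_TOKEN variable are hypothetical placeholders, not part of this role's actual stack.

```ts
// Minimal sketch: receive a webhook and call a downstream REST API.
// Assumes Node.js 18+ (global fetch) and Express; all URLs and tokens are illustrative.
import express from "express";

const app = express();
app.use(express.json());

// Webhook receiver: validate the payload, then enrich it via a downstream API.
app.post("/webhooks/profile-updated", async (req, res) => {
  const event = req.body;
  if (!event?.customerId) {
    return res.status(400).json({ error: "customerId is required" });
  }

  // Consume a (hypothetical) REST API to look up the customer record.
  const response = await fetch(
    `https://api.example.com/customers/${event.customerId}`,
    { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }
  );
  if (!response.ok) {
    return res.status(502).json({ error: "upstream lookup failed" });
  }
  const customer = await response.json();

  console.log("enriched event", { ...event, segment: customer.segment });
  res.status(202).json({ accepted: true });
});

app.listen(3000, () => console.log("webhook listener on :3000"));
```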
Supervision Responsibilities:
- Collaborating with, mentoring, and grooming junior and senior Data Engineers.
Experience:
- 10+ years of relevant work experience, with a concentration in building complex data integrations.
- 4+ years of experience working in Azure Data Factory building integrations.
- 3+ years of experience working in Adobe Experience Platform building data ingestions (see the sketch after this list).
- Expertise in and understanding of Adobe Experience Platform concepts:
  - Data Management (Schema, Datasets, Queries…)
  - Workflows, Connectors (Source, Destination)
  - RTCDP, Audience, Segmentation…
  - Monitoring
  - Policies, Data lifecycle…
- Work experience in a professional services industry preferred.
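Likewise, purely as an illustration of the Adobe Experience Platform ingestion experience listed above, the sketch below outlines the typical batch ingestion flow (create a batch, upload a file, mark the batch complete). The endpoint paths, headers, and payload shapes are recalled from Adobe's Batch Ingestion API and should be verified against current Adobe documentation; the ingestRecords helper and all environment variables are hypothetical.

```ts
// Rough sketch of an AEP batch ingestion flow (create batch -> upload file -> complete).
// Endpoint paths, headers, and payload shapes are from memory; verify against Adobe docs.
const BASE = "https://platform.adobe.io/data/foundation/import";

const headers = {
  Authorization: `Bearer ${process.env.AEP_ACCESS_TOKEN}`,
  "x-api-key": process.env.AEP_API_KEY ?? "",
  "x-gw-ims-org-id": process.env.AEP_ORG_ID ?? "",
  "x-sandbox-name": process.env.AEP_SANDBOX ?? "prod",
  "Content-Type": "application/json",
};

async function ingestRecords(datasetId: string, records: object[]) {
  // 1. Create a batch against the target dataset.
  const createRes = await fetch(`${BASE}/batches`, {
    method: "POST",
    headers,
    body: JSON.stringify({ datasetId, inputFormat: { format: "json" } }),
  });
  const { id: batchId } = await createRes.json();

  // 2. Upload the records as a file within the batch.
  await fetch(`${BASE}/batches/${batchId}/datasets/${datasetId}/files/records.json`, {
    method: "PUT",
    headers: { ...headers, "Content-Type": "application/octet-stream" },
    body: JSON.stringify(records),
  });

  // 3. Signal completion so AEP can promote the batch to the data lake.
  await fetch(`${BASE}/batches/${batchId}?action=COMPLETE`, { method: "POST", headers });
  return batchId;
}
```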
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.