This role is responsible for the execution of data architecture solutions for complex initiatives that span multiple Lines of Business and control functions. Key responsibilities include facilitating solution-driven discussions, working with stakeholders to support adherence to the enterprise data management policy and standards, and supporting architecture design reviews to ensure the integration of data architecture principles into technology solutions. Job expectations include educating data management teams on enterprise data architecture principles and data management processes and routines.
Data Modeler (Band 4)
Job Description:
The Enterprise Independent Testing (EIT) Analytics team is part of the Data, Infrastructure & Strategic Initiatives organization that provides foundational capabilities for EIT, which is the center of excellence that conducts independent testing globally across the company. EIT has established itself as the enterprise utility to test all end-to-end processes on behalf of Second Line of Defense, Frontline Units (FLUs) and Control Functions (CFs), enabling more efficient and effective testing. EIT is transforming the way testing is built and executed with digital transformation, implementing automation and leveraging cutting edge Artificial Intelligence/Natural Language Processing technologies. EIT carries a portfolio of 8,000+ tests and has a global footprint of testing professionals across the United States, Europe, Asia and Latin America as well as Global Business Services (GBS) associates.
EIT Analytics is responsible for partnering with stakeholders to identify data and reporting needs, leveraging insights and capabilities to simplify data structures and reporting, and delivering an improved experience while increasing focus on risk management.
Role Overview:
As a Data Modeler, your main responsibilities will include:
• Review and understand all required data elements.
• Estimate data resource needs.
• Develop and evolve logical and physical data models.
• Establish the target data storage solution.
• Coordinate with data stewards on data policy requirements (retention, HRCI, privacy, etc.) and ensure the data model enables efficient adherence to them.
• Coordinate the capture of metadata for the data model.
• Train EIT resources on data model.
• Oversee the migration to the target data model.
• Influence strategic direction and develop tactical plans.
Required Education & Experience:
• Bachelor’s degree or above in fields including but not limited to: Finance, Economics, Mathematics, Computer Science, Statistics, Process and Mechanical Engineering, Operations Research, Data Science, Accounting, Business Administration
• 5+ years of relevant work experience
Required Skills:
Intellectual Curiosity
• Applies critical thinking and connects the dots on how processes relate to one another.
• Demonstrates an understanding of, and passion for, the “why”.
• Looks around the corner, explores uncharted territories with an “outside-in” perspective.
• Lifelong learner who not only proactively educates themselves but also encourages others to learn and grow.
• Feels ownership of and accountability for delivering high-quality work; able to prioritize effectively, adapt, and meet strict deadlines.
• Ability to recommend and implement process control improvements.
Executive Interaction & Communication
• Strong written, verbal, and presentation creation and delivery skills.
• Communications are timely, concise, easy to follow and tailored to topic, audience, and competing priorities.
• Exercises excellent judgment, discerning appropriate moments to challenge or to insert a point of view.
• Presentations tell a compelling story and influence action.
• Asks the next level of questions, applies context to determine direction.
Technical Skills
• Expert knowledge of complex data architecture, including modeling and data science tools and libraries, data warehouses, and machine learning.
• Building data architecture that is optimized for large dataset retrieval, analysis, storage, cleansing, and transformation.
• Designing, developing, and applying scalable software solutions.
• Managing large data sets utilizing tools such as Hadoop.
• Experience with a wide range of data storage solutions and an understanding of their efficient use.
• Understanding of the differences between various storage solutions and the ability to translate data requirements into technical designs.
• Experience with the Software Development Lifecycle.