As part of our Data and Analytics team within the Financial Services Consulting practice, you will work with multidisciplinary teams to support clients across a wide range of big data initiatives aimed at generating and presenting new, useful and actionable insights. You will have the opportunity to take on responsibilities in challenging engagements, gaining exposure to clients in various sectors both in Singapore and across the APAC region.
Your Key Responsibilities
- Participate in large-scale client engagements.
- Contribute to, or lead, the delivery of innovative and engaging big data solutions.
- Understand business and technical requirements, provide subject matter expertise, and implement big data engineering techniques.
- Conduct data discovery activities, perform root cause analysis, and make recommendations for the remediation of data quality issues.
- Apply strong organizational and time management skills, with the ability to prioritize and complete multiple complex projects under tight deadlines.
Skills and Attributes for Success
- Leverage technology to continually learn, improve service delivery and maintain our leading-edge best practices
- Strong presentation skills and proficiency in the use of PowerPoint, Word and Excel
- Good understanding of the financial services industry
To qualify for the role, you must have
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- A minimum of 6 years of relevant experience. Candidates with more than 9 years of relevant experience may be considered for the Senior Manager position.
- Understanding of, and ideally practical experience with, handling and manipulating semi-structured and unstructured data.
- Deep understanding of big data technologies and concepts, including the tools, features, functions and benefits of the different approaches available.
- Ability to deploy, manage, and administer Hadoop-based components.
- Ability to design, build, install, configure and support Hadoop-based applications.
- Experience with at least one of Java, Python, C# or C++.
- Experience with ETL tools such as Talend, Informatica, AWS Glue, Azure Data Factory. Hands-on experience with Talend is a plus.
- Hands-on experience with HiveQL.
- Familiarity with data ingestion tools such as Kafka, Flume and Sqoop.
- Knowledge of Hadoop-related workflow/scheduling tools such as Oozie.
- Understanding of data modeling techniques (e.g. entity-relationship models).
- Experience with investigating and handling data quality issues.
Ideally, you’ll also have
- Experience designing or implementing physical data models on one or more leading RDBMS platforms such as SQL Server, Oracle, IBM DB2/Netezza or Teradata.
- Experience with Business Intelligence or statistical analysis tools and techniques.
- Strong communication and business relationship skills, with the ability to explain analysis clearly, both verbally and in writing, and to translate it into a clear business plan.
- Strong time management and organizational skills to gather and make use of data (both internal and external).
What we offer
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.
Apply now.
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.