Your key responsibilities
- Proven experience driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. [3 - 7 years]
- Understand current- and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices
- Interact with senior client technology leaders to understand their business goals, then architect, propose, develop, and deliver technology solutions
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment
- Recommend design alternatives for data ingestion, processing and provisioning layers
- Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies
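The streaming-ingestion responsibility above typically pairs Kafka with Spark Streaming's windowed aggregations. As an illustrative sketch only (standard-library Python, with an in-memory list standing in for a Kafka topic and no Spark dependency; event names are invented), the core tumbling-window count that such a job expresses looks like:

```python
from collections import Counter
from datetime import datetime, timedelta

def windowed_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed tumbling windows and
    count occurrences of each key per window -- the same shape of
    computation a Spark Structured Streaming job expresses with
    window() + groupBy()."""
    windows = {}
    for ts, key in events:
        # Align the timestamp to the start of its tumbling window.
        bucket = ts - timedelta(seconds=ts.timestamp() % window_seconds)
        windows.setdefault(bucket, Counter())[key] += 1
    return windows

# Simulated "live" events, e.g. messages read off a Kafka partition.
t0 = datetime(2024, 1, 1, 12, 0, 0)
events = [
    (t0, "login"),
    (t0 + timedelta(seconds=10), "login"),
    (t0 + timedelta(seconds=70), "purchase"),
]
result = windowed_counts(events)
# Two one-minute windows: {"login": 2} in the first, {"purchase": 1} in the second.
```

In a real pipeline, the list of events would be replaced by a Kafka consumer and the dictionary by Spark's distributed state store; the windowing logic itself is the part that carries over.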
AWS
- Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
- Experience in PySpark/Spark/Scala
- Experience using software version control and CI tools (Git, Apache Subversion, Jenkins)
- AWS certifications or other related professional technical certifications
- Experience with cloud or on-premises middleware and other enterprise integration technologies.
- Experience in writing MapReduce and/or Spark jobs.
- Demonstrated strength in architecting data warehouse solutions and integrating technical components.
- Good analytical skills with excellent knowledge of SQL.
- 3+ years of work experience with very large data warehousing environments
- Excellent communication skills, both written and verbal
- 3+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools
- 3+ years of experience with data modelling concepts
- 3+ years of Python and/or Java development experience
- 3+ years of experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive)
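The warehousing and ETL/ELT experience listed above comes down to the extract-transform-load pattern. A minimal sketch, using Python's standard-library sqlite3 as a stand-in for a warehouse such as Redshift or Hive (the table, columns, and sample rows are invented for illustration):

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system.
raw_orders = [
    ("ord-1", "2024-01-05", "149.90"),
    ("ord-2", "2024-01-05", "20.00"),
    ("ord-3", "2024-01-06", "75.50"),
]

# Transform: cast string amounts to integer cents so aggregation is exact.
clean = [(oid, day, int(round(float(amt) * 100))) for oid, day, amt in raw_orders]

# Load: in a real pipeline the target would be Redshift or Hive;
# here an in-memory SQLite database stands in.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id TEXT, order_day TEXT, amount_cents INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

# Reporting query: daily revenue rollup, the kind a BI layer would run.
daily = con.execute(
    "SELECT order_day, SUM(amount_cents) FROM orders GROUP BY order_day ORDER BY order_day"
).fetchall()
# → [("2024-01-05", 16990), ("2024-01-06", 7550)]
```

The same three stages appear whether the transform runs in Spark before loading (ETL) or as SQL inside the warehouse after loading (ELT); only where the transform executes changes.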
Skills and attributes for success
- Experience architecting highly scalable solutions on AWS
- Strong understanding of and familiarity with AWS/GCP/Big Data ecosystem components
- Strong understanding of underlying AWS/GCP Architectural concepts and distributed computing paradigms
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming
- Hands on experience with major components like cloud ETLs, Spark, Databricks
- Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB
- Knowledge of Spark-Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
- Strong understanding of underlying Hadoop Architectural concepts and distributed computing paradigms
- Good knowledge of Apache Kafka & Apache Flume
- Experience with enterprise-grade solution implementations
- Experience in performance benchmarking enterprise applications
- Experience in data security (in transit and at rest)
- Strong UNIX operating system concepts and shell scripting knowledge
To qualify for the role, you must have
- Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Excellent communicator (written and verbal, formal and informal).
- Ability to multi-task under pressure and work independently with minimal supervision.
- Strong verbal and written communication skills.
- Must be a team player and enjoy working in a cooperative and collaborative team environment.
- Adaptable to new technologies and standards.
- Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support
- Responsible for the evaluation of technical risks and map out mitigation strategies
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP
- Excellent business communication, consulting, and quality-process skills
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains
- Minimum 10 years of industry experience
Ideally, you’ll also have
- Strong project management skills
- Client management skills
- Solutioning skills
What we look for
- People with technical experience and enthusiasm to learn new things in this fast-moving environment
You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.