Develop high-performance, scalable analytics solutions on the Big Data platform to enable the collection, storage, and analysis of massive data sets from multiple channels.
Utilize your in-depth knowledge of the Hadoop stack and storage technologies, including HDFS, Spark, MapReduce, YARN, Hive, Sqoop, Impala, Hue, and Oozie, to design and optimize data processing workflows.
Apply your expertise in NoSQL technologies like MongoDB, SingleStore, or HBase to efficiently handle diverse data types and storage requirements.
Implement near-real-time and streaming data solutions to provide up-to-date information to millions of Bank customers.
Collaborate with cross-functional teams to identify system bottlenecks, benchmark performance, and propose innovative solutions to enhance system efficiency.
Take ownership of defining Big Data strategies and roadmaps for the Enterprise, aligning them with business objectives.
Stay abreast of emerging technologies and industry trends related to Big Data, continuously evaluating new tools and frameworks for potential integration.
Provide guidance and mentorship to junior teammates.
Education*
Bachelor's or Master's degree in Science or Engineering.
Certifications (if any): NA.
Experience Range*
6 to 12 years.
Foundational Skills*
Minimum of 7 years of industry experience, with at least 5 years focused on hands-on work in the Big Data domain.
Highly skilled in Hadoop stack technologies such as HDFS, Spark, YARN, Hive, Sqoop, Impala, and Hue.
Strong proficiency in programming languages such as Python and Scala, along with Bash/shell scripting.
Excellent problem-solving abilities and the capability to deliver effective solutions for business-critical applications.
Strong command of Visual Analytics Tools, with a focus on Tableau.
Desired Skills*
Proficiency in NoSQL technologies like HBase, MongoDB, SingleStore, etc.
Experience with real-time streaming technologies.
Familiarity with Cloud Technologies such as Azure, AWS, or GCP.
Working knowledge of machine learning algorithms, statistical analysis, and programming languages (Python or R) for conducting data analysis and building predictive models that uncover valuable patterns and trends.
Proficiency in Data Integration and Data Security within the Hadoop ecosystem, including knowledge of Kerberos.