Design and develop software components using tools such as PySpark, Sqoop, Flume and Azure Databricks.
Perform detailed analysis and interact effectively with onshore and offshore team members.
Ensure all deliverables conform to the highest quality standards and are executed in a timely manner.
Work independently with minimum supervision.
The role is deadline-oriented and may require working on a US time schedule.
Identify areas of improvement and drive changes to streamline the work environment.
Conduct performance tests.
Consult with the design team.
Ensure high performance of applications and provide support.
Team player who works well with development and product engineering teams.
Problem solver who is good at troubleshooting complex problems to find the root cause and provide solutions.
Passionate developer who is committed to delivering high-quality solutions and products.
Position Requirements - Staff:
2–4 years of experience in the Banking and Capital Markets (BCM) or Wealth and Asset Management (WAM) industry; exposure to a US-based asset management or fund administration firm is an added advantage.
Should have an understanding of data in the BCM/WAM space and be well versed with key data elements (KDEs) such as Funds, Positions, Transactions, Trial Balance, Securities and Investors, and their granularities.
Should be strong in programming languages, particularly Python.
Should have hands-on experience with Big Data tools such as PySpark, Sqoop, Hive and Hadoop clusters.
Should have hands-on experience with cloud technologies, preferably Azure, and experience working with Azure Databricks.
Should be an expert at working with databases such as Oracle and SQL Server; exposure to Big Data platforms is a plus.
Knowledge of data visualization tools is a plus.
Should be able to write programs to perform file/data validations, exploratory data analysis (EDA) and data cleansing (see the first sketch after this list).
Should be highly data-driven and able to write complex data transformation programs using PySpark and Python.
Experience in data integration and data processing using Spark and Python.
Hands-on experience in creating real-time data streaming solutions using Spark Streaming and Flume (see the streaming sketch after this list).
Experience in handling large data sets (terabytes) and writing Spark jobs and Hive queries to perform data analysis.
Experience working in an agile environment.
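As an illustration of the file validation, cleansing and EDA work described above, here is a minimal PySpark sketch. It assumes a hypothetical positions extract with fund_id, trade_date and market_value columns; the path, column names and rules are illustrative only, not a prescribed implementation.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: validate and cleanse a positions extract.
spark = SparkSession.builder.appName("positions-validation").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("/data/landing/positions.csv")  # assumed landing path
)

# File/data validation: required columns must be present.
required = {"fund_id", "trade_date", "market_value"}
missing = required - set(raw.columns)
if missing:
    raise ValueError(f"Missing required columns: {missing}")

# Cleansing: drop duplicates, filter null keys, normalize types.
clean = (
    raw.dropDuplicates()
    .filter(F.col("fund_id").isNotNull())
    .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
    .withColumn("market_value", F.col("market_value").cast("double"))
)

# Simple EDA: row counts and market value totals per fund.
clean.groupBy("fund_id").agg(
    F.count("*").alias("row_count"),
    F.sum("market_value").alias("total_market_value"),
).show()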
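Similarly, a hedged sketch of a real-time streaming job using Spark Structured Streaming. It assumes a Flume (or other) agent is delivering JSON transaction files into a landing directory; the path, schema and window sizes are illustrative, and the Flume agent configuration itself is out of scope here.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transactions-stream").getOrCreate()

# Assumed schema for incoming transaction events (illustrative).
schema = "fund_id STRING, txn_type STRING, amount DOUBLE, event_time TIMESTAMP"

stream = (
    spark.readStream.schema(schema)
    .json("/data/stream/transactions")  # assumed landing directory
)

# Windowed aggregation: transaction totals per fund every 5 minutes,
# tolerating events that arrive up to 10 minutes late.
totals = (
    stream.withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), F.col("fund_id"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Write running totals to the console for demonstration purposes.
query = (
    totals.writeStream.outputMode("update")
    .format("console")
    .start()
)
query.awaitTermination()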
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.