and growing regulatory needs. Vast amounts of data assets have been accumulated over the
years. A data fabric built on emerging technologies will facilitate the data being inspected,
cleansed, and transformed to support decision-making.
This job involves being part of a dynamic team for Markets Data Risk Reporting on Cancel &
Corrects and Open/Unconfirmed trades and contributing towards its software development.
The ideal candidate will have an eye for building and optimizing data systems and will work
closely with our Systems Architects, Data Scientists, and Analysts to help direct the flow of
data within the pipeline and ensure consistency of data delivery and utilization across
multiple projects.
Role / Position Overview
Olympus (a re-platforming of Ocean), the regulatory reporting infrastructure, is being rebuilt
strategically, starting with Equities. As part of the Olympus build-out, the developer will work
to rebuild the Markets Data Risk Enterprise Program covering Cancel & Corrects along with
Open and Unconfirmed trades data.
We need a strong database developer with a thorough understanding of advanced database
concepts to understand the existing application and then migrate it to Olympus. The specific
skill sets required are exposure to any RDBMS, Python, and PySpark or JavaSpark. Experience
with any ETL tool is a good-to-have skill.
Key Responsibilities:
The role will include but not be limited to the following:
•Design/implement data objects using Data Warehousing methodologies, including
Oracle or similar relational database tools, SQL and PL/SQL. Implement DWH solutions
using Spark SQL and Python on Big Data.
•Identify re-usable database components and develop recommendations for ALPS target
•Align to database programming standards and best practices.
•Develop ETL jobs using Talend 8.x for processing of data in a data warehouse.
•Collaborate with project teams to refine functional requirements and translate into
technical architecture/design
•Continuously monitor/tune database performance, identifying potential
issues/opportunities for improvement, and outline recommendations to improve
•Work with development teams ensuring adherence to database standards
•Oversee change management process for database objects across multiple projects
•Accountable for delivery of the database objects through SIT, UAT and Production.
•Liaise with clients to determine requirements and interpret into solutions
•Mentoring and training of junior team members
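As an illustration only, a reporting query of the kind the responsibilities above describe (summarising Open, Confirmed, and Cancelled trade populations) might resemble the following sketch. It uses Python's built-in sqlite3 purely for portability, as a stand-in for the Oracle/Spark SQL warehouse; the table and column names (`trades`, `status`, `notional`) are hypothetical, not taken from the actual platform.

```python
import sqlite3

# In-memory database stands in for the Oracle / Spark SQL warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (
        trade_id  TEXT PRIMARY KEY,
        status    TEXT,    -- e.g. OPEN, CONFIRMED, CANCELLED
        notional  REAL
    );
    INSERT INTO trades VALUES
        ('T1', 'OPEN',      100.0),
        ('T2', 'CONFIRMED', 250.0),
        ('T3', 'CANCELLED',  75.0),
        ('T4', 'OPEN',       40.0);
""")

# Count and total notional per status, as a risk-reporting feed might.
rows = conn.execute("""
    SELECT status, COUNT(*) AS n, SUM(notional) AS total
    FROM trades
    GROUP BY status
    ORDER BY status
""").fetchall()

for status, n, total in rows:
    print(status, n, total)
```

In the real stack the same GROUP BY would run via Spark SQL or PL/SQL over the warehouse tables; only the aggregation pattern is what the sketch shows.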
Development Value:
Contribute towards the goal of increasing revenue by using key metrics for decision making.
The ideal candidate will have:
•8+ years of experience within the technology or banking industry
•Strong experience in developing ETL solutions using PySpark and a thorough
understanding of advanced DWH concepts.
•Strong hands-on experience in developing API modules using Python.
•Strong experience/advanced knowledge of designing conceptual, logical & physical data
models and generating initial Data Definition Language (DDL)
•Very strong database design/development experience using Oracle 12c/19c
•Working experience in Hadoop, Hive, and Impala
•Expert in SQL & PL/SQL modules such as packages, procedures, functions and other
database objects
•Expert in Database Performance Tuning
•Strong DBA skills using Oracle 12c/19c
•Experience with Java will be an added advantage.
•Expert in Big Data querying tools e.g. Hive and Impala.
•Writing Python modules and APIs related to various data abstraction layers
•Experience in working with any ETL tool such as Talend 7.x or higher will be an added
advantage
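The "Python modules and APIs related to data abstraction layers" bullet could, as a minimal sketch, look like the class below: a thin data-access layer that hides the SQL behind methods. sqlite3 again stands in for Oracle, and the `TradeStore` name, schema, and upsert behaviour are all assumptions for illustration, not the platform's actual API.

```python
import sqlite3

class TradeStore:
    """Minimal data-abstraction layer over a relational store (hypothetical)."""

    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS trades (trade_id TEXT PRIMARY KEY, status TEXT)")

    def upsert(self, trade_id, status):
        # Re-stating a trade_id overwrites its status (a "correct").
        self.conn.execute(
            "INSERT INTO trades VALUES (?, ?) "
            "ON CONFLICT(trade_id) DO UPDATE SET status = excluded.status",
            (trade_id, status))

    def open_trades(self):
        return [r[0] for r in self.conn.execute(
            "SELECT trade_id FROM trades WHERE status = 'OPEN' ORDER BY trade_id")]

store = TradeStore(sqlite3.connect(":memory:"))
store.upsert("T1", "OPEN")
store.upsert("T2", "CONFIRMED")
store.upsert("T1", "CANCELLED")   # correction: T1 is cancelled
print(store.open_trades())
```

Keeping the SQL behind such a layer is what lets the callers stay unchanged when the backing store migrates, which is the point of the migration-to-Olympus work described above.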