Responsibilities:
- Participate in the definition of end-to-end strategic roadmaps for applications within Treasury Data Services and assess progress based on consistent Software Development Lifecycle (SDLC) processes leveraging Agile and SAFe methodologies.
- Work with business partners and perform data analysis to create functional designs from business requirements.
- Participate in building technical solutions and assist in project completion end-to-end for the Corporate Investments Data Warehouse.
- Ensure deliverables comply with Enterprise Change Management standards.
- Enhance the architecture, data model and data capabilities of the Corporate Investments Data Warehouse.
- Work closely with the project developers and architects across multiple projects/work-streams at the same time.
- Design solution architectures for data warehousing systems using Oracle SQL, Informatica PowerCenter, Teradata, and Python to transform and analyze data.
- Construct system prototypes, flowcharts, and data flows to represent the current and future states of the impacted systems using ERWIN, MS Visio, Python, and Airflow.
- Develop data quality rules, perform data profiling, and prepare data visualizations using Alteryx, Tableau, and Power BI.
- Craft data solutions to meet business and enterprise requirements using Quality Center and Alteryx.
- Analyze large data sets by manipulating, cleansing, and processing data using PySpark, AWS Cloud, Airflow, and CI/CD cloud deployment.
- Remote work may be permitted within a commutable distance from the worksite.
Requirements:
- Bachelor’s degree or equivalent in Information Technology, Engineering (any), Applied Computer Science, Computer Information Systems, Management Information Systems, or a related field.
- 5 years of progressively responsible experience in the job offered or a related IT occupation.
- Must include 5 years of experience in each of the following:
- Designing solution architectures for data warehousing systems using Oracle SQL, Informatica PowerCenter, Teradata, and Python to transform and analyze data;
- Constructing system prototypes, flowcharts, and data flows to represent the current and future states of the impacted systems using ERWIN, MS Visio, Python, and Airflow;
- Developing data quality rules, performing data profiling, and preparing data visualizations using Alteryx, Tableau, and Power BI;
- Crafting data solutions to meet business and enterprise requirements using Quality Center and Alteryx; and
- Analyzing large data sets by manipulating, cleansing, and processing data using PySpark, AWS Cloud, Airflow, and CI/CD cloud deployment.
If interested, apply online at or email your resume to and reference the job title of the role and requisition number.
Salary: $158,500 - $168,500 per year.
Shift:
1st shift (United States of America)