Duties:
- Develop and design ETL solutions based on business requirements and aggressive delivery timelines.
- Understand business and functional requirements and convert them into technical design documents to deliver on those requirements.
- Ensure best practices are followed.
- Prepare detailed test plans and ensure proper testing for each module developed.
- Prepare handover documents and manage SIT, with oversight of UAT and production implementation.
- Identify and proactively resolve issues that could affect system performance, reliability, and usability.
- Ensure process compliance and manage leadership expectations.
- Explore existing application systems and determine areas of complexity and potential risks to successful implementation.
- Build and provision Big Data clusters for SIT/UAT environments and design the end-to-end big data pipeline and testing.
- Extract and analyze sample data from operational systems and ingest it into the big data Hadoop platform to validate user requirements and create high-level design documents.

A telecommuting/hybrid work schedule may be permitted within a commutable distance from the worksite, in accordance with Citi policies and protocols.
Requirements: Bachelor’s degree, or foreign equivalent, in Electronic Engineering, Computer Engineering, or a related field, and six (6) years of experience in the job offered or in a related occupation. The six (6) years of experience must include:
- Designing and developing large-scale data warehousing applications and generic ETL frameworks using Ab Initio, Express IT, ACE/BRE, and Control Center;
- Working with schemas, including Star and Snowflake, to meet reporting query and business analysis requirements;
- Performing unit testing and integration testing using functional testing tools, including HPQC and JIRA;
- Working with Big Data technologies, including Hadoop, Apache Spark, Hive, Hue, Query IT, Avro, Parquet, and rules engines;
- Delivering enterprise-scale applications in Big Data technologies;
- Building a data quality framework consisting of a common set of model components and patterns that can be extended to implement complex process controls and data quality measurements using Hadoop, Angular, and Spark; and
- Performing continuous integration and delivery of applications using Jenkins, Bitbucket, SonarQube, and Black Duck.

40 hrs./wk. Applicants submit resumes at . Please reference Job ID #24784385. EO Employer.
Wage Range: $147,653.50 to $173,500.00
Full time | Tampa, Florida, United States