Responsibilities:
Develop and deliver complex software requirements to accomplish business goals.
Ensure that software is developed to meet functional, non-functional, and compliance requirements.
Code solutions and unit tests, and ensure the solution can be integrated successfully into the overall application/system with clear, robust, and well-tested interfaces.
Perform spikes/proofs of concept as necessary to mitigate risk or implement new ideas.
Analyze test reports, identify any test issues/errors, and triage the underlying cause.
Document and communicate required information for deployment, maintenance, support, and business functionality.
Implement high-level design documents following DDS architectural standards for extracting, transforming, validating, and loading data, including ETL (Extract, Transform, Load) process data dictionaries, metadata descriptions, file layouts, and flow diagrams.
Create high-performance data models in databases such as DB2 and Oracle for faster querying, streaming, batch processing, and transaction storage.
Develop scalable ETL solutions with consideration for business rules, fault tolerance, and error logging.
Write Unix shell scripts to automate processes and data validations.
Work on Confluent Kafka for real-time stream processing using Kafka Streams and KSQL.
Work hands-on with Kafka Connect, Schema Registry, and REST Proxy servers.
Remote work may be permitted within a commutable distance from the worksite.
Requirements:
Bachelor’s degree or equivalent in Computer Science, Computer Information Systems, Management Information Systems, Engineering (any), or a related field; and
5 years of progressively responsible experience in the job offered or a related IT occupation.
Must include 5 years of experience in each of the following:
Implementing high-level design documents following DDS architectural standards for extracting, transforming, validating, and loading data, including ETL (Extract, Transform, Load) process data dictionaries, metadata descriptions, file layouts, and flow diagrams;
Creating high-performance data models in databases such as DB2 and Oracle for faster querying, streaming, batch processing, and transaction storage;
Developing scalable ETL solutions with consideration for business rules, fault tolerance, and error logging;
Writing Unix shell scripts to automate processes and data validations;
Working on Confluent Kafka for real-time stream processing using Kafka Streams and KSQL; and
Working hands-on with Kafka Connect, Schema Registry, and REST Proxy servers.
If interested, apply online at or email your resume to and reference the job title of the role and requisition number.
Pay range: $164,000 - $174,000
Shift:
1st shift (United States of America)