Job Responsibilities
- Execute standard software solutions across design, development, and technical troubleshooting.
- Write secure and high-quality code using the syntax of at least one programming language with limited guidance.
- Design, develop, code, and troubleshoot with consideration of upstream and downstream systems and technical implications.
- Apply knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation.
- Apply technical troubleshooting to break down solutions and solve technical problems of basic complexity.
- Gather, analyze, and draw conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development.
- Learn and apply system processes, methodologies, and skills for the development of secure, stable code and systems.
Required Qualifications, Capabilities, and Skills
- Formal training or certification on software engineering concepts and 2+ years of applied experience.
- Minimum of six years of experience with the Ab Initio suite of products, including expertise in developing ETL processes using Ab Initio graphs, continuous flows, plans, reusable components, and metaprogramming.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
- Experience in Python development for data flow pipelines; proficiency in SQL and PL/SQL, and familiarity with technologies such as Oracle and Cloud SQL.
- Hands-on exposure to containerization environments such as Docker and Kubernetes deployments, and job orchestration technologies such as Control-M and Airflow.
- Exposure to methodologies such as CI/CD (using Jenkins), Application Resiliency, and Security, as well as tools like Splunk.
Preferred Qualifications, Capabilities, and Skills
- Familiarity with modern Big Data processing frameworks such as PySpark.
- Exposure to cloud technologies, specifically AWS, with a minimum of an Associate-level AWS certification.
- Databricks Certified Data Engineer or an equivalent certification.