This role requires working in a shift pattern or non-standard hours as needed, which may include weekend work.
Minimum qualifications:
Bachelor's degree in Computer Science or equivalent practical experience.
6 years of experience with two or more of the following: web technologies, data/big data, systems administration, machine learning, networking, Kubernetes, Oracle, or SAP.
Experience designing cloud enterprise solutions and supporting customer projects to completion.
Experience coding in one or more general-purpose languages (e.g., Python, Java, Go, C, or C++), including data structures, algorithms, and software design.
Experience in customer advocacy.
Preferred qualifications:
Experience with Linux/Unix or other operating systems (e.g., kernel to shell, file systems, client-server protocols).
Experience administering and querying data in distributed or investigation-oriented databases, or in distributed data processing frameworks.
Experience with open-source distributed storage and processing utilities in the Apache Hadoop family, or with workflow orchestration products.
Experience in data analytics, warehousing, ETL development, data science or other Big Data applications.
Knowledge of web application development and deployment, HTTP/RESTful Application Programming Interface (API) troubleshooting, or database design and troubleshooting.
Knowledge of networking fundamentals (Transmission Control Protocol/Internet Protocol (TCP/IP), routing, Virtual Private Networks (VPNs)).