Bachelor's degree in Computer Science or equivalent practical experience.
5 years of experience in project management and technical solution delivery.
5 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external stacks (e.g., Dataflow, Spark).
Experience using big data technologies (e.g., Hadoop, Spark, Hive) and SQL-based database management systems.
Ability to travel 30% of the time, as needed, for client engagements.
Ability to communicate fluently in English and Spanish to support local stakeholders in this region.
Preferred qualifications:
MBA or Master's degree in Computer Science, Engineering or a related field.
Cloud certification and experience with Kubernetes, GKE, or EKS.
Experience in one or more of the following disciplines: software development, managing large-scale Windows or Linux environments, network design and deployment, databases, or storage systems.
Experience with networking and system design (e.g., load balancers, firewalls, VPNs) in architecting, developing, and maintaining production-grade systems.
Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations.
Excellent communication, writing, presentation, and problem-solving skills.