Job Details
Responsibilities
* Architect and implement ETL platforms using open-source technologies such as Spark on Kubernetes, EMR, S3/SQS, Iceberg, Delta Lake, etc.
* Advance and operate the ETL platforms in a full DevOps model
* Operate in an Agile development environment, including participating in daily scrums
* Support the team’s engineering excellence by performing design reviews, code reviews and mentoring junior team members
* Provide on-call support
Required Skills/Experience
* B.A./B.Sc. in Computer Science/Engineering with 4+ years of industry experience
* Extensive experience building large-scale distributed systems using cloud services (AWS, GCP, Azure, etc.)
* Deep understanding of object-oriented programming and experience with at least one object-oriented language (Java, Python)
* Experience with Big Data platforms such as Apache Spark, EMR, EKS, Flink
* Experience building services with Docker and Kubernetes
* Experience with Scrum or other agile development methodologies, with attention to code quality and delivering secure code
* Excellent written and verbal communication skills with outstanding attention to detail
* A passion for working as a team and ensuring teammates’ success; lead and mentor junior members of the team
Desired Skills/Experience
* Experience with relational and NoSQL databases, message queues (Kafka, SNS/SQS, etc), and other components in data pipelines
* Experience building services with Terraform, Helm, Ansible, and Spinnaker
* Strong foundational knowledge of security concepts: authentication/authorization frameworks (e.g., SSO, SAML, OAuth), secure transport (e.g., SSL, TLS), identity management (e.g., certificates, PKI)
* Good knowledge of operating system and network technologies, such as TCP/IP, DNS, or load balancing