Minimum qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, a related field, or equivalent practical experience.
4 years of experience developing and troubleshooting data processing algorithms and software using Python, Java, Scala, Spark, and Hadoop frameworks.
Experience with distributed data processing frameworks and modern GCP analytical and transactional data stores such as BigQuery, Cloud SQL, and AlloyDB, and experience writing SQL in at least one database type.
Experience with GCP.
Preferred qualifications:
Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT pipelines, reporting/analytics tools, environments, and data structures.
Experience with encryption techniques such as symmetric, asymmetric, and envelope encryption, as well as Hardware Security Modules (HSMs), and the ability to implement secure key storage using a Key Management System (KMS).
Experience building multi-tier, high-availability applications with modern technologies such as NoSQL databases (e.g., MongoDB), SparkML, and TensorFlow.
Experience architecting and developing software, or building internet-scale, production-grade Big Data solutions in virtualized environments.
Experience in Big Data, information retrieval, data mining, or Machine Learning.
Experience with Infrastructure as Code (IaC) and CI/CD tools such as Terraform, Ansible, and Jenkins.