Multi-disciplinary Big Data Architect to design and build scalable, robust data platforms for Marketing, CRM, campaign management, and other back-office applications
Responsible for multiple projects at once, serving as the main source of technical knowledge and experience, facilitating, building (hands-on), and mentoring R&D teams
Responsible for cloud infrastructure, architecture, microservices, CI/CD, production environments, and new, innovative, and complex developments
Responsible for research, analysis, and performing proofs of concept for new technologies, tools, and design concepts
Design and develop core modules in our big data platform infrastructure (hosted on Google Cloud, using managed services and Kubernetes, based on: Spark Core/Structured Streaming/SQL, Scala, Python, AngularJS, Node.js, Kafka, BigQuery, Redis, Elasticsearch, Google Cloud Machine Learning Engine, and TensorFlow)
The platform handles huge amounts of data through complex processing in batch and real-time modes and complex data manipulation, using services, UI frameworks, and interactive notebooks.
What we expect
8+ years of practical experience with high-level programming languages (Java/Scala/Python, etc.) and excellent programming skills - design patterns, data structures, and a TDD approach - must
6+ years of hands-on experience building large-scale (petabyte), low-latency distributed systems using modern cloud computing technologies (GCP - preferred, AWS) - must
5+ years of experience building large-scale (petabyte) streaming/batch ETL pipelines using modern processing engines (Spark - preferred, Beam, Flink, etc.) - must
Expert SQL knowledge - you know how to write efficient, low-latency queries against modern data-warehouse solutions (BigQuery - preferred, Redshift, Athena, etc.) - must
Experience working with data streaming systems (Kafka - preferred, Pub/Sub, or Kinesis) - must
English: Advanced - must
5+ years of experience in DevOps architecture with continuous integration/delivery solutions
Desired
Experience working with storage solutions (HDFS, S3, GCS - preferred) - big advantage
Experience working with notebook solutions (Databricks, Jupyter) - big advantage