Your Role and Responsibilities
- Skilled in multiple GCP services – GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Cloud Composer, Error Reporting, Logs Explorer, etc.
- Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations
- Ability to analyse data against functional business requirements and interface directly with customers
Required Technical and Professional Expertise
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered, including the purpose and KPIs each transformation served
- Expert in SQL – able to perform data analysis and investigation using SQL queries
- Implementation knowledge of advanced SQL functions such as regular expressions, aggregation, pivoting, ranking, deduplication, etc.
- BigQuery and BigQuery transformations (using stored procedures)
- Data modelling concepts – star and snowflake schemas, fact and dimension tables, joins, cardinality, etc.
- GCP services related to data pipelines, such as Workflows, Cloud Composer, Cloud Scheduler, and Cloud Storage
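The ranking and deduplication skills listed above can be sketched with a short example. This is a minimal illustration, not part of the role description: it uses SQLite in-memory (via Python's standard library) purely so the snippet is self-contained, but the same `ROW_NUMBER()` window-function pattern is what a candidate would write in BigQuery, where it is often combined with the `QUALIFY` clause. The table and column names are invented for illustration.

```python
import sqlite3

# In-memory SQLite database with a toy table containing a duplicate
# business key (order_id = 1 appears twice with different timestamps).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, customer TEXT, updated_at TEXT);
INSERT INTO orders VALUES
  (1, 'alice', '2024-01-01'),
  (1, 'alice', '2024-02-01'),
  (2, 'bob',   '2024-01-15');
""")

# Deduplicate by keeping the latest record per order_id:
# ROW_NUMBER() ranks rows within each partition, and the outer
# filter keeps only rank 1 (the most recent row).
dedup_sql = """
SELECT order_id, customer, updated_at
FROM (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY order_id
               ORDER BY updated_at DESC
           ) AS rn
    FROM orders
)
WHERE rn = 1
ORDER BY order_id;
"""

rows = conn.execute(dedup_sql).fetchall()
for row in rows:
    print(row)
```

In BigQuery the inner subquery can be dropped in favour of `QUALIFY ROW_NUMBER() OVER (...) = 1`, which filters on the window function directly.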
Preferred Technical and Professional Expertise
- Understanding of CI/CD and related tools such as Git and Terraform
- Other GCP services such as Dataflow, Cloud Build, Pub/Sub, Cloud Functions, Cloud Run, Cloud Workstations, etc.
- BigQuery performance tuning
- Spark development experience and Python-based API development experience