Job responsibilities:
- Provides recommendations and insight on data management, governance procedures, and considerations applicable to the acquisition, maintenance, validation, and utilization of data
- Designs and delivers trusted data platform solutions for data collection, storage, access, and analytics in a secure, stable, and scalable way
- Builds and supports a lakehouse model enabling thousands of analysts and data scientists
- Builds and supports scalable, fine-grained access controls across various platforms
- Supports data consumption on the data lake by enabling various BI tools in the cloud
- Builds observability and alerting setups for logs, traces, and metrics
- Evaluates and reports on access control processes to determine effectiveness of data asset security
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering and application support concepts, and 5+ years of applied experience
- Working experience with both relational and NoSQL databases
- Advanced understanding of database backup, recovery, and archiving strategies
- Advanced knowledge of linear algebra, statistics, and geometrical algorithms
- Experience presenting and delivering visual data
- 5+ years of experience supporting big data platforms (cloud and on-premises)
- Working experience in setting up observability dashboards
- Working experience with the AWS data stack: EMR, Glue, Lake Formation, and Athena
- Working experience with big data and Spark, and strong Java/Python development experience
- Working experience with BI tools
Preferred qualifications, capabilities, and skills:
- Experience with Grafana, Splunk, Dynatrace, or Datadog
- Experience building APIs is a plus
- Experience with Terraform is a plus