Job responsibilities
- Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors
- Provides recommendations and insight on data management, governance procedures, and the considerations involved in acquiring, maintaining, validating, and utilizing data
- Designs and delivers trusted data platform solutions for search, storage, access, and analytics in a secure, stable, and scalable way
- Builds observability and alerting setups for logs, traces, and metrics
- Influences peers and project decision-makers to consider the use and application of leading-edge technologies
- Adds to the team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on data architecture concepts and 5+ years applied experience
- Experience delivering data architecture and system designs, data engineering, testing, and operational stability
- Advanced knowledge of architecture, applications, and technical processes, with considerable in-depth knowledge of the data architecture discipline and its solutions (e.g., data modeling, cloud-native data services, business intelligence, artificial intelligence, machine learning, data domain-driven design)
- Understanding of database design concepts (e.g., pluggable databases) and data modeling for relational and non-relational databases such as RDBMS (Oracle, PostgreSQL), NoSQL (MongoDB, Cassandra), or NewSQL (CockroachDB), as well as search databases such as Elasticsearch, including multi-master deployments across multiple regions
- Understanding of data design and modeling principles as well as architecture patterns such as data lake, lakehouse, data mart, data fabric, and data mesh; experience with data migrations and open/linked data media types such as RDF, Turtle, and JSON-LD is useful
- Experience deploying to public and/or private clouds, ideally multi-cloud and cloud-agnostic (e.g., Snowflake or another public cloud platform)
- Experience with one or more big data processing frameworks such as Spark, Flink, or Storm, along with stream processing experience using Kafka
- Understanding of how AI/ML applied to the right data can drive operational excellence
- Experience with Docker and Kubernetes
- Experience in Computer Science, Computer Engineering, Mathematics, or a related technical field
Preferred qualifications, capabilities, and skills
- Proactive, team player, and problem-solver
- Previous experience with large-scale systems design; knowledge of web services such as REST and RPC APIs
- AWS certifications (e.g., Developer Associate) or other professional certifications
- Practical experience in machine learning/AI, with Python development a big plus; knowledge of architecture assessment frameworks, e.g., the Architecture Tradeoff Analysis Method (ATAM)