Job responsibilities
- Play a pivotal role in shaping our bank's data strategy
- Manage a team of managers and individual contributors
- Lead a talented team of engineers, partner with cross-functional stakeholders, and own the roadmap for building a unified, scalable, secure, and reliable data platform to enable advanced analytics, machine learning, and real-time decision-making
- Define and execute the vision, strategy, and roadmap for the data platform in alignment with the company’s business objectives
- Drive innovation in the data platform space, ensuring scalability, flexibility, and future-proof architecture
- Act as a thought leader and trusted advisor to executive leadership, providing strategic recommendations on data initiatives and solutions
- Architect and oversee the development of a next-generation data platform leveraging cutting-edge technologies
- Ensure the platform supports complex use cases, including real-time streaming, big data processing, and advanced machine learning
- Create the right team processes, structure, and systems to build and maintain high-quality code that powers our data infrastructure
- Promote data quality, integrity, and security across the organization
Required qualifications, capabilities, and skills
- Formal training or certification in relevant job skill concepts and 10+ years of applied experience, including 5+ years of experience leading technologists to anticipate, manage, and solve complex technical problems within your domain of expertise and more broadly across the organization
- Proven track record of building and scaling data platforms in high-growth or enterprise environments
- Hands-on experience and expertise in data manipulation using tools such as Python, SQL, Spark, and Databricks
- Deep understanding of distributed systems, cloud-native architectures, and modern data technologies (e.g., Spark, Kafka, Databricks, Kubernetes, Confluent)
- Experience in cloud platforms such as AWS, GCP, or Azure
- Knowledgeable in data lakes, warehouses, and feature stores (e.g., AWS Lake Formation, Feast, Databricks, Snowflake)
- Strong proficiency in programming languages (e.g., Python, Java) and infrastructure as code (e.g., Terraform, CloudFormation)
- Hands-on experience with ETL and data pipeline orchestration tools (e.g., Airflow, AWS Glue) and real-time processing frameworks
- Exceptional ability to communicate complex technical concepts to both technical and non-technical audiences
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field
Preferred qualifications, capabilities, and skills
- Experience working at the code level