Essential Responsibilities:
Manage and optimize Big Data and cloud technologies to ensure reliable and efficient operations.
Design, implement, and oversee scalable compute infrastructure tailored to support data scientists' needs.
Create insightful reports leveraging tools such as Tableau, MicroStrategy, ThoughtSpot, Qlik, Looker, and others.
Optimize system performance through tuning and proactive efficiency improvements.
Collaborate with business users to address queries, resolve issues, and provide technical guidance.
Lead efforts in data migration, system upgrades, and maintenance of databases and associated tools.
Develop and maintain automation scripts for platform administration and monitoring tasks.
Deliver database engineering, infrastructure builds, and release management services with a focus on quality and reliability.
Partner with cross-functional operations teams to handle incidents, perform root cause analysis, and resolve problems effectively.
Take ownership of implementing modules, applications, or products from concept to deployment.
Minimum Qualifications:
Minimum of 5 years of relevant work experience and a Bachelor's degree or equivalent experience.
Preferred Qualifications:
9+ years of experience with enterprise-grade database technologies and large-scale data platforms.
Proven experience designing mission-critical data infrastructure at petabyte scale.
Deep expertise in cloud computing architectures (AWS, Azure, GCP).
Advanced proficiency in automation and scripting languages (Python, Bash).
Demonstrated leadership in cross-functional collaboration and high-impact delivery.
Experience with generative AI tools for database optimization.