Manage and optimize big data and analytics environments (e.g., Hadoop, Redshift, BigQuery) to ensure efficient data storage and retrieval.
Develop automation scripts and tools using Python or Ruby to streamline data operations and workflows.
Oversee the configuration, deployment, and management of Linux servers to support data operations.
Utilize infrastructure provisioning tools like Chef to automate the setup and maintenance of data systems.
Monitor the performance of data systems, troubleshoot issues, and implement solutions to enhance reliability and efficiency.
Collaborate with data engineers, data scientists, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
Ensure data security and compliance with relevant regulations and best practices.
Provide mentorship and guidance to junior team members, fostering a culture of continuous learning and improvement.
What you’ll bring
Extensive experience with GCP BigQuery.
Experience working with at least one big data environment (e.g., Hadoop, Druid, Vertica, Redshift, BigQuery).
Strong expertise in Linux system administration and troubleshooting.
Hands-on experience with infrastructure provisioning tools like Chef and scripting languages (Python, Ruby) is an advantage.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
Excellent problem-solving skills and the ability to work independently or as part of a team.
Strong communication skills and the ability to collaborate effectively with cross-functional teams.
Proven track record of managing complex data environments and delivering high-quality solutions on time.
More than snacks!
Hybrid working (office and home)
Expand your toolbox with our internal learning tools