
Snowflake: Senior Data Engineer
Pune, Maharashtra, India
Job ID: 596336474
Posted: 10.09.2024

Key Responsibilities:

  • Design and Develop Data Platform Tools: Collaborate with cross-functional teams to design, develop, and maintain data platform tools and infrastructure components using cutting-edge technologies such as Kubernetes, Docker, and Airflow.
  • Optimize Data Pipelines: Implement efficient data pipelines to ingest, transform, and load data into Snowflake using SQL and Python, ensuring high performance and reliability (a minimal sketch of such a pipeline follows this list).
  • Monitor and Maintain Data Platform: Monitor the health and performance of data platform tools and infrastructure, proactively identifying and resolving issues to ensure maximum uptime and reliability.
  • Automate Deployment Processes: Implement automation scripts and tools to streamline the deployment and configuration of data platform components using Kubernetes and Docker.
  • Collaborate with Data Teams: Work closely with data engineering, data science, and analytics teams to understand their requirements and provide support in leveraging data platform tools effectively.
  • Ensure Security and Compliance: Implement security best practices and ensure compliance with data governance policies and regulations across all data platform components.
  • Documentation and Knowledge Sharing: Document technical designs, procedures, and best practices, and actively participate in knowledge-sharing sessions to facilitate cross-team collaboration and learning.
  • Apply a Product Mindset: Maintain a strong focus on delivering high-quality products that meet the needs of internal stakeholders and customers.
  • Thrive in a Fast-Paced Environment: Work effectively and adapt quickly in a dynamic, fast-moving setting.
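
As referenced in the data-pipeline responsibility above, the following is a minimal sketch of an ingest-and-transform step into Snowflake using Python and SQL. It assumes the snowflake-connector-python package; the credentials, warehouse, stage, and table names (ORDERS_STAGE, ORDERS_RAW, CURATED.ORDERS) are hypothetical placeholders, not details taken from this posting.

    import os

    import snowflake.connector  # assumes snowflake-connector-python is installed

    # Hypothetical connection details; in practice these would come from a secrets manager.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    try:
        cur = conn.cursor()
        # Ingest: load staged CSV files into a raw table (stage and table names are illustrative).
        cur.execute("""
            COPY INTO RAW.ORDERS_RAW
            FROM @RAW.ORDERS_STAGE
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        """)
        # Transform and load: derive a deduplicated, cleaned table for downstream teams.
        cur.execute("""
            CREATE OR REPLACE TABLE ANALYTICS.CURATED.ORDERS AS
            SELECT DISTINCT order_id, customer_id, TO_DATE(order_date) AS order_date, amount
            FROM RAW.ORDERS_RAW
            WHERE order_id IS NOT NULL
        """)
    finally:
        conn.close()

In practice a script like this would typically be containerized with Docker and scheduled by an orchestrator rather than run by hand; a corresponding Airflow sketch appears after the qualifications list.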

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 6-8 years of experience in data engineering or related roles.
  • Strong proficiency in SQL and Python for data manipulation and scripting.
  • Experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes.
  • Experience with workflow management tools like Airflow for orchestrating complex data pipelines (a minimal DAG sketch follows this list).
  • Hands-on experience with Snowflake or other cloud data warehouse platforms is highly desirable.
  • Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
  • Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
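
As referenced in the Airflow qualification above, here is a minimal sketch of how an Airflow 2.x DAG might orchestrate the ingest and transform steps from the earlier Snowflake example. The DAG ID, schedule, and task callables are illustrative assumptions, not requirements taken from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_orders():
        # Placeholder: would run the COPY INTO step shown in the earlier sketch.
        pass


    def transform_orders():
        # Placeholder: would rebuild the curated table shown in the earlier sketch.
        pass


    with DAG(
        dag_id="orders_pipeline",        # illustrative DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest_orders", python_callable=ingest_orders)
        transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)

        ingest >> transform  # transform runs only after ingestion succeeds

Separating ingestion and transformation into distinct tasks keeps failures isolated and lets each step be retried independently, which is the usual reason to put such pipelines behind an orchestrator like Airflow.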