
Vimeo Senior Data Engineer
Bengaluru, Karnataka, India
Job ID: 251606265
Posted: 25.06.2024

Our platform offers robust analytics tools that show video creators where their viewers come from, what they click, and even where in a video someone stops watching, giving creators insight into making more engaging content in the future.

Working as a peer to both technical and non-technical staff throughout the company, you will drive improvements to our products and business operations. You will help define data access and discoverability requirements and build the infrastructure and services that provide access to event streams.

What you'll do:

  • Design, develop, and maintain Vimeo’s product and event analytics platform, which processes billions of records per day in real time to power analytics and event-driven systems
  • Partner with analytics leaders and analysts to ensure adherence to data governance and data modeling best practices
  • Partner with product engineering teams to enhance the efficiency of product analytics clients and data collection mechanisms
  • Contribute software designs, code, tooling, testing, and operational support to a multi-terabyte analytics platform
  • Provide technical leadership to the data engineering team and actively lead design discussions
  • Continuously monitor our data platform and recommend improvements to its architecture
  • Work collaboratively with other data engineers, analysts, and business stakeholders to understand and plan technical requirements for projects
  • Prioritize project intake, perform cost/benefit analyses, and decide which work to pursue to best balance the needs of our platform’s users, stakeholders, and technology

Skills and knowledge you should possess:

  • 5+ years of engineering experience in a fast-paced environment; 2+ years of experience in scalable data architecture, fault-tolerant ETL, and monitoring of data quality in the cloud
  • Deep understanding of distributed data processing architecture and tools such as Kafka and Spark
  • Working knowledge of design patterns and coding best practices
  • Experience with and understanding of data modeling concepts, techniques, and best practices
  • Experience with Airflow, Celery, or other Python-based task-processing systems
  • Proficiency in SQL
  • Proficiency in Python or Java
  • Proficiency with modern source control systems, especially Git
  • Experience working with non-technical business stakeholders on technical projects

Bonus points (nice-to-have skills, not required):

  • Cloud-based DevOps: AWS, Google Cloud Platform, or Azure
  • Experience with Amplitude, Snowplow, Segment, or other event analytics platforms
  • Relational database design
  • Snowflake or other distributed columnar-store databases
  • Basic Linux/Unix system administration skills
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes)