Success Indicators
In the short term, success means delivering reliable, high-performance data pipelines and ensuring data quality across the product. Long-term, you'll be instrumental in optimizing workflows, enabling self-serve analytics platforms, and supporting strategic decisions through impactful data solutions.
Your work will directly fuel business decisions, improve data accessibility and reliability, and contribute to the team's ability to handle massive-scale data challenges. You'll help shape the future of data engineering within a global, fast-paced environment.
Benefits and Opportunities
What you'll be doing
- Designing and developing scalable data pipelines and ETL processes to handle massive volumes of structured and unstructured data.
- Collaborating with cross-functional teams (data science, finance, analytics, and R&D) to deliver actionable data solutions tailored to their needs.
- Building and maintaining tools and frameworks to monitor and improve data quality across the product.
- Providing tools and insights that empower product teams with real-time analytics and data-driven decision-making capabilities.
- Optimizing data workflows and architectures for performance, scalability, and cost efficiency using cutting-edge technologies like Apache Spark and Flink.
What we're looking for
- 4+ years of experience as a Data Engineer.
- Expertise in designing and developing scalable data pipelines, ETL processes, and data architectures.
- Proficiency in Python and SQL, with hands-on experience in big data technologies like Apache Spark and Hadoop.
- Advanced knowledge of cloud platforms (AWS, Azure, or GCP) and their associated data services.
- Experience working with Imply and Apache Druid for real-time analytics and query optimization.
- Strong analytical skills and ability to quickly learn and adapt to new technologies and tools.
You might also have
- Hands-on experience with stream-processing frameworks like Apache Flink and Kafka for real-time data integration and analytics.
- Knowledge of functional programming concepts, particularly using Scala.
- Familiarity with data visualization tools like Tableau or Power BI for creating impactful dashboards.
- Experience with machine learning frameworks or building ML pipelines and MLOps workflows.
- Previous exposure to ad-tech data solutions or working within ad-serving ecosystems.
Additional information
- Relocation support is not available for this position.
- Work visa/immigration sponsorship is not available for this position.
This position requires sufficient knowledge of English for professional verbal and written exchanges, as the duties involve frequent and regular communication with colleagues and partners located worldwide whose common language is English.