As a key member of our team, you will:
- Lead the design and implementation of scalable data pipelines and storage solutions.
- Drive technical innovation and best practices for data engineering and analytics.
This role requires exceptional analytical thinking, problem-solving skills, and a demonstrated ability to work through ambiguity. You’ll have the opportunity to influence decision-making across WWFBA by providing reliable, actionable insights.

If you're eager to work with large-scale global datasets, explore new technologies, and contribute to the future of FBA, we’d love to hear from you. Reach out to schedule an informational chat with the hiring manager.
- Bachelor's degree in Computer Science, Engineering, Analytics, Mathematics, Statistics, IT, or an equivalent field.
- 5+ years of data engineering, database engineering, business intelligence, or business analytics experience.
- 4+ years of experience in developing and operating large-scale ETL/ELT processes, data modeling, and database management.
- 4+ years of coding experience in modern programming or scripting languages (e.g., Python, Scala, Java, C#, PowerShell).
- 4+ years of experience working with relational databases (e.g., Redshift, Oracle, Postgres, SQL Server).
- 4+ years of experience in building and maintaining highly available, distributed systems for large-scale data extraction, ingestion, and processing.
- Advanced SQL skills with a strong focus on query performance optimization.
- Hands-on experience with massively parallel processing (MPP) data technologies (e.g., Redshift, Spark, Hadoop).
- Proficiency with Python-based data analysis libraries (e.g., Pandas/Polars).
- Master’s or PhD in Computer Science, Engineering, Mathematics, or a related field.
- Experience in building data products incrementally and integrating datasets from multiple sources.
- Proven expertise in managing large-scale data environments (e.g., data lakes, lakehouses, or data warehouses).
- Strong SQL query performance tuning skills using Unix/Linux profiling tools.
- Proficiency with AWS services (e.g., SNS, Redshift, RDS, S3, EC2, Athena, Glue, Lambda, SageMaker, EventBridge, CloudWatch Logs, SQS, Route 53).
- Experience with data visualization using BI platforms (e.g., Tableau, Power BI, QuickSight).