- Develop and improve the current data architecture using AWS Redshift, AWS S3, AWS Aurora (PostgreSQL), and Hadoop/EMR.
- Improve the data ingestion models, ETL jobs, and alarming to maintain data integrity and availability.
- Stay up to date with advances in data persistence and big data technologies, and run pilots to design a data architecture that scales with growing advertiser-experience data sets.
- 7+ years of data engineering experience
- Bachelor's degree in computer science, engineering, analytics, mathematics, statistics, IT, or an equivalent field
- Knowledge of distributed systems as they pertain to data storage and computing
- Experience with data modeling, warehousing, and building ETL pipelines
- Knowledge of professional software engineering best practices across the full software development life cycle, including coding standards, software architecture, code reviews, source control management, continuous deployment, testing, and operational excellence
- Experience with big data technologies such as Spark, Hadoop, Hive, and Presto, and with AWS technologies such as EMR, Redshift, S3, AWS Glue, Kinesis, Firehose, Lambda, VPC, and IAM roles