This role will interact extensively with cross-functional groups such as database engineers, software developers, business intelligence engineers, business analysts, and business leaders to capture requirements, build databases, design and vend reports, and maintain the supporting infrastructure. Successful candidates will have experience with AWS technologies, ETL mechanisms, and database design.

Key job responsibilities
- Design, develop, and maintain the infrastructure and processes to support global data services programs.
- Gather, backlog, and prioritize requirements from business partners.
- Identify key performance indicators and actionable insights.
- Write high-quality SQL queries to retrieve and analyze data.
- Write high-quality scripts to transform, test, and access data.
- Provide coaching to colleagues in best practices.

A day in the life
This role starts the day by ensuring the data we vend is accessible to all of our customers, reviewing system health metrics and the status of the various automated jobs. They'll then check for any tickets that came in for ad-hoc operational requests before moving on to the project work assigned for that sprint. Typically they'll need to reach out to several stakeholders to discuss requirements and iterate on solutions. After project work is complete, they'll update their tracking tickets and evaluate the backlog for the next priority. About once a week, they'll take some time to explore new technologies and recommend how they can be integrated into the program.
Benefits
1. Medical, Dental, and Vision Coverage
2. Maternity and Parental Leave Options
3. Paid Time Off (PTO)
4. 401(k) Plan
Basic qualifications
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience mentoring team members on best practices
Preferred qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience operating large data warehouses
- Knowledge of professional software engineering and best practices for the full software development life cycle, including coding standards, software architecture, code reviews, source control management, continuous deployment, testing, and operational excellence