About the Role
As a Senior Data Engineer, you will build and maintain the Data Engineering team's data platform. You'll manage containerized applications and support our product event streaming systems in the cloud while partnering with high-impact teams such as AI and Product Analytics. You will also build data pipelines using tools such as Airflow, Python, ECS, and Snowflake, and will be expected to drive and mature best practices such as IaC (Terraform) and DevOps within the Data Engineering team.
In this role, you will:
Build and maintain hosted environments for key tools with AWS ECS/EKS
Manage cloud environments for the data engineering team through Terraform
Support our product events data architecture, including Kafka, SQS, EKS & S3
Develop data pipelines with Fivetran, Snowflake & Airflow using languages such as Python, SQL & JavaScript
Provide thought leadership and contribute to the vision of our data engineering function
Participate in team processes such as on-call rotations, bug triage, technical direction, standards, and execution
Own delivery architecture/execution of major component(s) from conception to release
Serve as a technical mentor within the data engineering team, making others better through code reviews, a focus on documentation, and technical guidance
Understand the tradeoffs between technical and business needs, interact and negotiate with key stakeholders, and deliver solutions that take all of these needs into account
Regularly take complex designs / codebases and simplify them without being asked
Work closely with leadership to drive adoption of the latest DevOps and DataOps trends and technologies
The skills you’ll bring include:
5+ years of hands-on software engineering experience
4+ years working with a major cloud provider (preferably AWS); experience building/maintaining VPCs and deploying code using Terraform is a must!
3+ years building and maintaining hosted environments and applications using EKS or another container service is a must!
5+ years of experience in at least one programming language such as Python, Java, or Scala is required (Python is our most commonly used language); advanced SQL expertise is required
Experience working in a modern lakehouse is required (Snowflake is preferred); Modern warehousing best practices should be second nature
Cloud experience is required (AWS is strongly preferred); Terraform is highly preferred
Knowledge of, and ideally hands-on experience with, container services (ECS, Kubernetes, etc.) is required
Experience working in a mature SDLC environment (i.e., CI/CD) is required
Modern tech stack experience is a plus (dbt, Fivetran, Snowflake, Airflow)
Experience as a leader within a data engineering team and ability to mentor teammates
Strong work ethic, resiliency, persistence, and urgency; Data Engineering holds itself to a high standard so you’ll need to keep up!
Sharp business and interpersonal skills; ability to influence at senior levels across business units to drive change and achieve common goals
BS or MS in Computer Science, Analytics, Statistics, Informatics, Information Systems, or another quantitative field, or equivalent experience