Job Description
Lead Data Engineer
invent for life
Primary Responsibilities:
- Play a key role in the success and growth of the Data Engineering team by mentoring engineers and providing technical leadership within the team
- Drive innovation within Data Engineering by playing a lead role in technology decisions for the future of our data science, analysis, and reporting needs
- Work with business partners and software engineers to gather, understand, and bridge definitions and requirements
- Lead the design and development of highly complex, critical data projects with strict timelines
- Improve team efficiency and effectiveness by implementing data tools (self-service, data quality, etc.)
- Design, develop, and maintain data pipelines that extract data from a variety of sources and populate the data lake and data warehouse
- Develop data transformation rules and data modeling capabilities
- Collaborate with Data Analysts, Data Scientists, and Machine Learning Engineers to identify and transform data for ingestion, exploration, and modeling
- Work with the data governance team to implement data quality checks and maintain data catalogs
- Use orchestration, logging, and monitoring tools to build resilient pipelines
- Use test-driven development when building ELT/ETL pipelines
- Understand and apply concepts like data lake, data warehouse, lakehouse, data mesh, and data fabric where relevant
- Develop data models for cloud data warehouses like Redshift and Snowflake
- Develop pipelines to ingest data into cloud data warehouses
- Understand and be able to use different database types: relational, document, graph, and key/value
- Analyze data using SQL
- Use serverless AWS services like Glue, Lambda, Step Functions
- Use Terraform to deploy infrastructure on AWS
- Containerize Python code using Docker
- Use Git for version control and understand various branching strategies
- Build pipelines to work with large datasets using PySpark
- Develop proofs of concept using Jupyter Notebooks
- Work as part of an agile team
- Create technical documentation as needed
Education:
- Bachelor’s Degree or equivalent experience in a relevant field such as Mathematics, Computer Science, Engineering, Artificial Intelligence, etc.
Required Experience and Skills:
- 9+ years of total experience
- Strong experience with AWS services such as S3, ECS, Fargate, Glue, Step Functions, CloudWatch, Lambda, and EMR
- SQL
- Proficient in Python, PySpark
- Proficient with Git, Docker, and Terraform
- Ability to work in cross-functional teams
Preferred Experience and Skills:
- Any AWS developer or architect certification
- Agile development methodology
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.