Key job responsibilities
- Design and develop the pipelines required for optimal extraction, transformation, and loading (ETL) of data from a wide variety of data sources using SQL, Python, and AWS big data technologies (see the illustrative sketch after the qualifications list).
- Oversee and continually improve production operations, including optimizing data delivery, redesigning infrastructure for greater scalability, and coordinating code deployments, bug fixes, and overall release management.
- Establish and maintain best practices for the design, development, and support of data integration solutions, including documentation.
- Read, write, and debug data processing and orchestration code written in Python, Scala, or similar languages, following best coding standards (e.g., version-controlled and code-reviewed).
Basic qualifications
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
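The role centers on building ETL pipelines in Python and SQL. As a rough illustration only, and not part of the role description, the sketch below shows the general shape of a minimal extract-transform-load step; the CSV source, the events table, and the use of SQLite in place of an AWS warehouse are all hypothetical stand-ins.

# Minimal illustrative ETL sketch -- hypothetical source file, table name,
# and SQLite target standing in for an AWS warehouse service.
import csv
import sqlite3
from datetime import datetime

SOURCE_CSV = "events.csv"    # hypothetical raw data source
TARGET_DB = "warehouse.db"   # SQLite file standing in for the warehouse


def extract(path):
    """Read raw rows from a CSV source."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)


def transform(rows):
    """Normalize types and drop rows that fail basic validation."""
    for row in rows:
        try:
            yield (
                int(row["user_id"]),
                datetime.fromisoformat(row["event_time"]).isoformat(),
                row["event_type"].strip().lower(),
            )
        except (KeyError, ValueError):
            continue  # skip malformed rows


def load(records, db_path):
    """Write transformed records into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS events "
            "(user_id INTEGER, event_time TEXT, event_type TEXT)"
        )
        conn.executemany("INSERT INTO events VALUES (?, ?, ?)", records)


if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)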