Job responsibilities
- Collaborates closely with cross-functional teams to develop efficient data pipelines that support data-driven initiatives
- Implements best practices for data engineering, ensuring data quality, reliability, and performance
- Contributes to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows
- Performs data extraction and implements complex data transformation logic to meet business requirements
- Leverages advanced analytical skills to improve data pipelines and ensure consistent data delivery across projects
- Monitors and executes data quality checks to proactively identify and address anomalies
- Ensures data availability and accuracy for analytical purposes
- Identifies opportunities for process automation within data engineering workflows
- Communicates technical concepts to both technical and non-technical stakeholders
Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 3+ years of applied experience
- 4+ years of data engineering experience building and optimizing data pipelines, architectures, and data sets
- Proficiency in object-oriented and functional scripting languages such as Python
- Advanced SQL knowledge and experience with relational databases, along with familiarity with a variety of database technologies
- Working understanding of NoSQL databases
- Experience developing ETL processes and workflows for streaming data from heterogeneous data sources
- Willingness and ability to learn new skill sets
Preferred qualifications, capabilities, and skills
- Experience with data pipeline and workflow management tools such as Airflow
- Experience working with modern data lake platforms such as Snowflake and Databricks
- Knowledge of data engineering practices, cloud platforms, automation, and CI/CD pipeline development