Your Role and Responsibilities
- Implementing and validating data pipelines in a Snowflake cloud environment using Python, handling multiple kinds of DataFrames (Snowflake, pandas, Spark).
- Reading and writing data with SQL queries across multiple databases (Snowflake, Oracle, Redshift) using multiple SQL dialects (SnowSQL, Oracle SQL, PostgreSQL).
- Working with data file inputs and outputs (Excel, JSON, INI, …), and being able to create Python functions that produce an expected outcome.
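As an illustrative sketch of the file-input handling mentioned above (the helper name, section names, and keys are invented for the example), a small Python function might merge INI and JSON configuration into one settings dict:

```python
import configparser
import json

def load_settings(ini_text: str, json_text: str) -> dict:
    """Merge pipeline settings from INI and JSON sources into one dict.

    Hypothetical helper for illustration; a real pipeline would read
    these from files rather than from in-memory strings.
    """
    parser = configparser.ConfigParser()
    parser.read_string(ini_text)
    # Each INI section becomes a nested dict of its key/value pairs.
    settings = {section: dict(parser[section]) for section in parser.sections()}
    # JSON keys are merged in at the top level.
    settings.update(json.loads(json_text))
    return settings

ini_sample = "[snowflake]\nwarehouse = ANALYTICS_WH\n"
json_sample = '{"batch_size": 500}'
settings = load_settings(ini_sample, json_sample)
```

A function written this way is easy to unit-test against an expected outcome, which is the point of the requirement.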
Required Technical and Professional Expertise
- Understanding database schemas such as the snowflake schema and the star schema, and being able to create ETLs between them using SQL stored procedures, Python, and Informatica PowerCenter.
- Being able to read/write data in an AWS cloud environment (Redshift, S3 buckets).
- Working in an Agile, collaborative environment, partnering with other data engineers and with business intelligence experts on the client side.
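To illustrate the snowflake-to-star ETL mentioned above, here is a minimal pandas sketch (all table and column names are invented sample data): a snowflaked product dimension, normalized into separate product and category tables, is flattened into a single star-schema dimension by joining the category attributes onto the product table.

```python
import pandas as pd

# Hypothetical snowflake-schema tables: the product dimension is
# normalized into a product table plus a separate category table.
dim_category = pd.DataFrame(
    {"category_id": [1, 2], "category_name": ["Tools", "Toys"]}
)
dim_product = pd.DataFrame(
    {"product_id": [10, 20], "category_id": [1, 2],
     "product_name": ["Hammer", "Ball"]}
)

def snowflake_to_star(products: pd.DataFrame,
                      categories: pd.DataFrame) -> pd.DataFrame:
    """Denormalize the snowflaked dimension into one star-schema dimension.

    Joins category attributes onto the product rows and drops the
    now-redundant foreign key, so the fact table can join a single
    wide dimension table.
    """
    return (products
            .merge(categories, on="category_id", how="left")
            .drop(columns=["category_id"]))

dim_product_star = snowflake_to_star(dim_product, dim_category)
```

In production the same transformation would typically run as a SQL stored procedure or an Informatica mapping rather than in-memory pandas; the sketch only shows the structural change between the two schemas.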