Bachelor’s degree or foreign equivalent in Computer Engineering or a related field and 5 years of progressive, post-baccalaureate experience in the job offered or a related occupation.
5 years of experience with each of the following skills is required:
Using structured query language (SQL) to build data pipelines for visualizations and translate business challenges into a technical layer.
Using Python to orchestrate and schedule automation projects, including implementing data connections, retrieving data, and transforming it into a business-friendly layer.
Using Teradata to build procedures and processes for the visualization data layer and automation.
Using extract, transform, load (ETL) to architect and build data pipelines from raw data sources, cleanse the data, and build semantic views for visualizations and analytics.
Using Unix shell scripting to schedule jobs and set job priorities.
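The pipeline work described above can be sketched in miniature. The following is a minimal illustration, not the employer's actual stack: it uses Python's standard-library sqlite3 as a stand-in for Teradata or another warehouse, and the table names, schema, and numbers are hypothetical.

```python
import sqlite3

def build_semantic_layer(conn: sqlite3.Connection) -> None:
    """ETL sketch: extract raw rows, cleanse them, load a semantic view."""
    cur = conn.cursor()
    # Extract: a raw source table (hypothetical schema).
    cur.execute("CREATE TABLE raw_events (campaign TEXT, clicks INTEGER)")
    cur.executemany(
        "INSERT INTO raw_events VALUES (?, ?)",
        [("spring", 10), ("spring", 5), ("fall", 7), ("fall", None)],
    )
    # Transform + Load: cleanse NULLs and aggregate into a
    # business-friendly semantic view for visualization tools.
    cur.execute(
        """
        CREATE VIEW v_campaign_clicks AS
        SELECT campaign, SUM(COALESCE(clicks, 0)) AS total_clicks
        FROM raw_events
        GROUP BY campaign
        """
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
build_semantic_layer(conn)
rows = dict(conn.execute("SELECT campaign, total_clicks FROM v_campaign_clicks"))
print(rows)
```

In a production setting the same shape would run against the warehouse on a schedule, with the cleansing and aggregation expressed as stored procedures or scheduled SQL rather than an in-memory database.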
2 years of experience with each of the following skills is required:
Using Tableau to visualize campaign performance and engineering-specific KPIs for measurement and insights, including multi-level grouping and trend analysis.
Using Snowflake to build visualization pipelines and automation processes in a cloud-based environment.
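The multi-level grouping and trend analysis mentioned above can be illustrated with a small stand-alone sketch. This is not Tableau's or Snowflake's API; the KPI records and field names below are hypothetical, and pure Python stands in for the dashboard layer.

```python
from collections import defaultdict

# Hypothetical daily KPI records: (campaign, channel, day, conversions).
records = [
    ("spring", "email", 1, 100),
    ("spring", "email", 2, 120),
    ("spring", "social", 1, 80),
    ("fall", "email", 1, 60),
    ("fall", "email", 2, 50),
]

# Multi-level grouping: (campaign, channel) -> day-ordered series.
series = defaultdict(list)
for campaign, channel, day, conversions in sorted(records, key=lambda r: r[2]):
    series[(campaign, channel)].append(conversions)

# Trend analysis: day-over-day change within each group.
trend = {
    key: [b - a for a, b in zip(vals, vals[1:])]
    for key, vals in series.items()
}
print(trend[("spring", "email")])  # [20]
```

In Tableau the same result would come from dragging campaign and channel onto rows and applying a difference table calculation; the sketch only shows the underlying grouping logic.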