Expoint - all jobs in one place


Amazon Business Intel Engineer, DISCO CAST
India, Tamil Nadu
Job ID: 833351003

Posted: 01.12.2024
DESCRIPTION

Key job responsibilities
• Design, develop, and operate scalable, performant data warehouse (Redshift) tables, data pipelines, reports, dashboards, and data transformation strategies to manage large volumes of data.
• Develop moderately to highly complex data processing jobs using appropriate technologies (e.g., SQL, Python, Spark, AWS Lambda).
• Develop dashboards and reports.
• Collaborate with internal stakeholders to understand business requirements and provide data-driven solutions.
• Develop complex SQL queries and optimize their performance on large datasets.
• Build and manage dashboards, scorecards, and other data visualizations using BI tools such as Tableau, Power BI, or Amazon QuickSight.
• Monitor Tableau and QuickSight dashboards to ensure the latest data is available; address technical issues in dashboard refreshes (fixing failed jobs, re-running jobs/pipelines, tracking upstream dependencies); coordinate resolution of technical issues; inform stakeholders of any unavoidable delays; and enhance existing dashboards based on low-level specifications.
• Conduct thorough data analysis and troubleshoot data integrity issues, providing gap analyses and business solutions.
• Actively manage project timelines and deliverables, anticipate risks, and resolve issues.
• Adopt business intelligence best practices in reporting and analysis.
• Monitor WBR jobs; address any issues (fixing failed jobs, re-running jobs/scripts, tracking upstream dependencies); coordinate on technical issues; and gather inputs to update commentary on metric fluctuations.
• Repeat existing analyses for new scenarios (geographies, tiers, device types), pulling datasets using low-level specifications.
• Monitor Datanet- and Cradle-based pipelines and the data pipelines that feed metrics into APT (Weblab Analysis Tool): address failures and delays, coordinate resolution of technical issues, create and monitor alarms/checks on the pipelines to track delays and ensure data quality, create or update pipelines based on low-level specifications, run and monitor back-fill jobs to generate historical datasets from existing pipeline jobs, and coordinate with the APT team to onboard or update APT metrics.

BASIC QUALIFICATIONS

- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with a scripting language (e.g., Python, Java, or R)
- Knowledge of data modeling and data pipeline design
- Experience gathering business requirements and using industry-standard business intelligence tools to extract data, formulate metrics, and build reports
- Bachelor's degree in BI, engineering, statistics, computer science, mathematics, finance, or an equivalent quantitative field
- Knowledge of AWS products such as Redshift, QuickSight, and Lambda
- Excellent verbal/written communication and data presentation skills, including the ability to succinctly summarize key findings and effectively communicate with both business and technical teams


PREFERRED QUALIFICATIONS

- Experience in the data/BI space
- Experience building and maintaining basic data artifacts (e.g., ETL jobs, data models, queries)
- Experience with data-specific programming languages/packages such as R or Python (pandas)
- Experience with AWS solutions such as EC2, DynamoDB, S3, and EMR
- Knowledge of machine learning techniques and concepts