Key job responsibilities
- Interview stakeholders to gather requirements and translate them into robust solutions that work well within the overall data architecture
- Monitor and troubleshoot operational or data issues in the data pipelines
- Identify and recommend opportunities to automate systems and tools
Basic qualifications
- Bachelor's degree in engineering, statistics, computer science, mathematics, or a related quantitative field
- 1+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, or similar systems
- Experience with data querying or modeling with SQL
- Experience with data visualization using Tableau, Quicksight, or similar tools
- Experience with writing and optimizing SQL queries in a business environment with large-scale, complex datasets
Preferred qualifications
- Master's degree
- Knowledge of data modeling and data pipeline design
- Experience gathering business requirements and using industry-standard business intelligence tools to extract data, formulate metrics, and build reports
- Experience with statistical methods such as the t-test or chi-square test
- Experience with a scripting language (e.g., Python, Java, or R)
- Knowledge of data warehouse architecture, OLAP, and reporting/analytics environments
- Experience with performing regression analysis, building ML classification models, and prompt engineering with Gen AI
- Experience with AWS technologies including Redshift, RDS, S3, EMR
- Strong communication and presentation skills necessary to build effective working relationships and positively influence decision making