Key job responsibilities
• Design, implement, and support a platform that provides secure access to large datasets.
• Interface with tax, finance and accounting customers, gathering requirements and delivering complete BI solutions.
• Model data and metadata to support ad-hoc and pre-built reporting.
• Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
• Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
• Tune application and query performance using profiling tools and SQL (see the sketch after this list).
• Analyze and solve problems at their root, stepping back to understand the broader context.
• Learn and understand a broad range of Amazon’s data resources, and know when, how, and which to use, and which not to.
• Keep up to date with advances in big data technologies and run pilots on AWS to design data architectures that scale with growing data volumes.
• Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
• Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.
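For a concrete flavor of the query-tuning work listed above, here is a minimal sketch in Python using the standard-library sqlite3 module: it inspects a reporting query's plan with EXPLAIN QUERY PLAN before and after adding an index. The orders table, its columns, and the index name are hypothetical illustrations, not systems referenced by this posting.

```python
# Minimal sketch: compare a reporting query's plan before and after adding an index.
# The schema (orders table, region/amount columns) is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("NA", 120.0), ("EU", 75.5), ("NA", 40.25)],
)

query = "SELECT region, SUM(amount) FROM orders WHERE region = ? GROUP BY region"

def explain(conn, sql, params):
    """Print the query plan so full-table scans are easy to spot."""
    for row in conn.execute("EXPLAIN QUERY PLAN " + sql, params):
        print(row)

explain(conn, query, ("NA",))   # before: expect a full scan of orders
conn.execute("CREATE INDEX idx_orders_region ON orders (region)")
explain(conn, query, ("NA",))   # after: expect the plan to use idx_orders_region
```

The same before/after comparison carries over to EXPLAIN or profiling output in whichever database engine the team actually runs; sqlite3 is used here only to keep the sketch self-contained.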
Qualifications
- Experience with SQL
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines (see the sketch after this list)
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
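As a hedged, self-contained illustration of the ETL-pipeline experience above, the sketch below extracts rows from a small CSV string, normalizes them, and loads them into a reporting table using Python's built-in csv and sqlite3 modules. The CSV layout, the column names, and the orders_clean table are hypothetical.

```python
# Minimal ETL sketch: extract rows from a CSV source, normalize them, and load
# them into a reporting table. The CSV layout and table name are hypothetical.
import csv
import io
import sqlite3

RAW_CSV = """order_id,region,amount
1001,na,120.00
1002,eu,75.50
1003,na,40.25
"""

def extract(raw: str):
    """Extract: parse the raw CSV into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: uppercase the region code and cast the amount to float."""
    return [(int(r["order_id"]), r["region"].upper(), float(r["amount"])) for r in rows]

def load(conn, records):
    """Load: write the cleaned records into the reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean (order_id INTEGER, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
print(conn.execute("SELECT region, SUM(amount) FROM orders_clean GROUP BY region").fetchall())
```

In practice the same extract/transform/load split maps onto whichever ETL tool or scheduler the team uses; the pure-Python version here just keeps the example runnable on its own.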