Key job responsibilities
Design, implement, and support a platform providing secure access to large datasets.
Interface with tax, finance, and accounting customers, gathering requirements and delivering complete BI solutions.
Model data and metadata to support ad-hoc and pre-built reporting.
Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
Tune application and query performance using profiling tools and SQL.
Analyze and solve problems at their root, stepping back to understand the broader context.
Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with increasing data volumes on AWS.
Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
Basic qualifications
- 3+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with data visualization software (e.g., AWS QuickSight or Tableau) or open-source equivalents
- Bachelor's or Master's degree
Preferred qualifications
- 5+ years of data engineering experience
- Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases and data stores (object storage, document or key-value stores, graph databases, column-family databases)