Key job responsibilities
You are experienced in building efficient, scalable data services and in integrating data systems with AWS tools and services to support a variety of customer use cases and applications.
Design, implement, and operate large-scale, high-volume, high-performance data structures for analytics and data science.
Implement data ingestion routines using best practices in data modeling and ETL/ELT processes, leveraging AWS technologies and big data tools.
Gather business and functional requirements and translate them into robust, scalable, operable solutions with a flexible and adaptable data architecture.
Identify opportunities for improvement in existing data solutions.
- Bachelor's degree
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Knowledge of writing and optimizing SQL queries in a business environment with large-scale, complex datasets
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage