Key job responsibilities
- Design, build, and implement the right ETL processes using AWS and similar technologies.
- Implement anomaly detection systems, using industry-standard frameworks, to take a proactive approach to potential data quality issues.
- Enable large-scale analytics using EMR and other big data technologies.
- Establish and implement technology best practices to be followed across the organization.
- Work on proofs of concept for the adoption of new technologies and tools.
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Knowledge of AWS Infrastructure
- Knowledge of software engineering best practices across the development life cycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience building data pipelines or automated ETL processes
- Experience writing and optimizing SQL queries with large-scale, complex datasets