Key job responsibilities
1. Design, implement, and support an analytical data infrastructure using AWS technologies.
2. Build robust and scalable data integration (ETL) pipelines using SQL and AWS data storage technologies such as Aurora and Redshift.
3. Design and develop analytics applications that support critical business functions, using modern scripting languages (Python, R, PHP, etc.).
4. Gather business and functional requirements and translate them into robust, scalable, operable solutions with a flexible and adaptable data architecture.
5. Lead the architecture design and implementation of a next-generation BI solution.
6. Continually improve ongoing reporting and analysis processes, automating or simplifying self-service modeling and production support for customers.
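To illustrate the kind of ETL work described above, here is a minimal extract-transform-load sketch in Python. It is purely illustrative: the table names (`raw_orders`, `orders_clean`), columns, and transformation rule are made up, and the standard-library `sqlite3` module stands in for a warehouse such as Amazon Redshift, which in practice would be reached through a proper driver or AWS service.

```python
# Illustrative ETL sketch. sqlite3 is a stand-in for a real warehouse
# (e.g., Amazon Redshift); all table and column names are hypothetical.
import sqlite3

def run_etl(conn):
    cur = conn.cursor()
    # Extract: read raw order events from the source table
    rows = cur.execute(
        "SELECT order_id, amount_cents FROM raw_orders"
    ).fetchall()
    # Transform: convert cents to dollars, drop non-positive amounts
    cleaned = [(oid, cents / 100.0) for oid, cents in rows if cents > 0]
    # Load: rebuild the analytics table so reruns stay idempotent
    cur.execute("DELETE FROM orders_clean")
    cur.executemany(
        "INSERT INTO orders_clean (order_id, amount_usd) VALUES (?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER);
        CREATE TABLE orders_clean (order_id INTEGER, amount_usd REAL);
        INSERT INTO raw_orders VALUES (1, 1250), (2, -50), (3, 300);
    """)
    print(run_etl(conn))  # number of rows loaded
```

Production pipelines layer scheduling, monitoring, and error handling on top of this extract-transform-load core, but the shape of the work is the same.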
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with ETL tools such as Informatica, ODI, SSIS, BODI, and DataStage