Key job responsibilities
- Implement real-time and batch data ingestion routines, applying best practices in data modeling and ETL/ELT processes and leveraging AWS technologies and big data tools.
- Design, implement, and operate large-scale, high-volume, high-performance data solutions for analysis and data science.
- Gather business and functional requirements and translate these requirements into robust, scalable, operable solutions with a flexible and adaptable data architecture.
Basic qualifications
- Bachelor's degree in an engineering or technical field
- 5+ years' experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools
- Experience with relational database concepts and solid knowledge of SQL
- Strong knowledge of various data warehousing methodologies and data modeling concepts.
- Experience performing performance tuning at both the database and ETL levels
- Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build processes, testing, and operations