Position Description:
You will be part of Honeywell's VE/CE CoE Advanced Tech team. As a Sr Advanced Data Engineer, you will design, implement, and manage the data architecture, systems, and processes needed to collect, store, process, and analyze high-volume, high-dimensional data, providing strategic insight into complex business problems. This involves creating and maintaining scalable, efficient, and secure data pipelines, data warehouses, and data lakes. You will ensure consistent data quality and availability for analysis and reporting, including compliance with data governance and security standards.
Key Responsibilities:
Work on complex data science and analytics projects in support of the VECE organization
Work with product owners to identify data requirements and design, maintain, and optimize data pipelines that ingest, transform, and load structured and unstructured data from various sources into the data warehouse or data lake
Design and implement data models and schemas to support analytical and reporting requirements
Develop and maintain ETL (Extract, Transform, Load) processes
Administer, optimize, and manage databases, data warehouses, and data lakes to ensure performance, reliability, and scalability
Create and maintain comprehensive documentation for data architecture, processes, and systems
Troubleshoot and resolve data-related problems and optimize system performance
YOU MUST HAVE:
10 or more years of relevant experience in Data Engineering, ETL Development, or Database Administration
Experience with Snowflake, Oracle, and BigQuery
Experience with data modelling techniques, including schema design for both relational and NoSQL databases
Experience with Azure Databricks, Google Cloud, and Azure
Experience with CI/CD and DevOps processes
Expertise in scripting and query languages such as Python, SQL, and PySpark
Experience with both Structured and Unstructured data
Knowledge of Agile development methodology
WE VALUE:
Working with at least one NoSQL system (HBase, Cassandra, MongoDB)
Knowledge of databases, data warehouse platforms (BigQuery, Snowflake), and cloud-based tools.
Experience using data integration tools for ETL processes.
Knowledge of widely used packages and tools such as scikit-learn, TensorFlow, PyTorch, GPT, PySpark, and Bitbucket.
Proven mentoring ability to drive results and technical growth in peers.