The future is what you make it.
As a Lead AI Data Analyst/Engineer you will be supporting the iterative development of data science projects by identifying promising data sources, wrangling the data into features for machine learning and statistical analysis, refining features until a suitable model is created, and supporting our enterprise IT organization in deploying data pipelines. You will support AI projects across a spectrum of Honeywell Aerospace functions including supply chain management, manufacturing operations, customer excellence, and financial forecasting. You will be instrumental in leading the acceleration of the AI CI/CD cycle by developing data engineering processes to more efficiently execute data science projects.
YOU MUST HAVE
- 6+ years of data engineering experience in any data warehousing platform.
- Expert knowledge of data warehousing concepts.
- Expert in SQL.
- Expert in ETL.
- Python programming experience.
- Experience developing end-to-end data pipelines (i.e., ingestion, transformation, storage).
WE VALUE
- Bachelor's degree in a STEM field.
- Experience developing and deploying complex big data ingestion jobs that bring AI prototypes to production on Databricks and Snowflake (or any data warehouse platform).
- Experience building advanced analytics solutions with data from enterprise systems such as ERPs (e.g., SAP HANA), CRMs, and marketing tools.
- Experience working with ambiguous AI data engineering requirements in the areas of supply chain, manufacturing, customer excellence, and finance.
- Experience promoting machine learning solutions and data science methods.
- Experience designing and implementing data models and SQL-based transformations on the Snowflake and Databricks data platforms.
- Experience developing and building applications that process very large volumes of structured and unstructured data, including streaming real-time data.
- Experience writing complex SQL statements and debugging and tuning SQL performance using query profilers.
- Experience with cloud-based deployments. Understanding of containers (Docker) and container orchestration (Swarm or Kubernetes).
- Good understanding of branching, build, deployment, and CI/CD methodologies using tools such as GitHub, Octopus, and Bamboo.
- Experience working with Agile methodologies and Scrum.
- Knowledge of software best practices, such as Test-Driven Development (TDD).
- Effective communication skills and succinct articulation.
- Experience with dimensional modeling, data warehousing and data mining.
- Database performance management and API development
- Technology upgrade oversight
- Experience with visualization software (Tableau preferred but not required).
- Understanding of best-in-class model and data configuration and development processes.
- Experience working with remote and global teams and in cross-team collaboration.
- Consistently makes timely decisions even in the face of complexity, balancing systematic analysis with decisiveness.
Additional Information - JOB ID: HRD235491
- Category: Data & Analytics
- Location: 1944 E Sky Harbor Circle, Phoenix, Arizona, 85034, United States
- Must be a US Person or able to obtain export authorization.