Job Purpose
AIP Suites (Data Modernization to Snowflake) builds an analytics-ready data architecture where data from source systems such as PDM (Product Data Management) and RDO is ingested into Snowflake for centralized storage and modeling. These models support ICE BI, which consumes Snowflake data for analytics and dashboarding. This design ensures clean separation between raw ingestion, transformation, analytics, and service-based consumption, supporting scalable and future-proof data-driven operations.
Responsibilities
- Provides Snowflake-based data warehouse design and development for projects involving new data integration, migration, and enhancement of existing pipelines.
- Designs and develops data transformation logic using SQL, Snowflake stored procedures, and Python-based scripts for ETL/ELT workloads.
- Builds and maintains robust data pipelines to support reporting, analytics, and application data needs.
- Creates and maintains Snowflake objects like tables, views, streams, tasks, file formats, and external stages.
- Participates in project meetings with data engineers, analysts, business users, and product owners to understand and implement technical requirements.
- Writes technical design documentation based on business requirements and data architecture principles.
- Develops and/or reviews unit testing protocols for SQL scripts, procedures, and data pipelines using automation frameworks.
- Completes documentation and procedures for pipeline deployment, operational handover, and monitoring.
- May mentor or guide junior developers and data engineers.
- Stays current with Snowflake features, best practices, and industry trends in cloud data platforms.
- Performs additional related duties as assigned.
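To illustrate the kinds of Snowflake objects and pipelines described above, a minimal sketch follows. All object, stage, and bucket names here are hypothetical, and credentials and error handling are omitted; this is an example of the pattern, not a prescribed implementation.

```sql
-- Hypothetical names throughout; a minimal sketch of the object types listed above.
CREATE FILE FORMAT csv_fmt TYPE = CSV SKIP_HEADER = 1;

-- External stage pointing at cloud storage (assumes an S3 bucket; credentials omitted)
CREATE STAGE pdm_stage
  URL = 's3://example-bucket/pdm/'
  FILE_FORMAT = (FORMAT_NAME = 'csv_fmt');

CREATE TABLE raw_products (id NUMBER, name STRING, updated_at TIMESTAMP_NTZ);

-- Stream captures inserts/updates on the raw table for incremental processing
CREATE STREAM raw_products_stream ON TABLE raw_products;

-- Task periodically merges captured changes into a modeled dimension table
CREATE TASK merge_products
  WAREHOUSE = transform_wh
  SCHEDULE = '60 MINUTE'
AS
  MERGE INTO dim_products d
  USING raw_products_stream s ON d.id = s.id
  WHEN MATCHED THEN UPDATE SET d.name = s.name
  WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);
```

This separation of raw table, stream, and scheduled task mirrors the raw-ingestion/transformation split described in the Job Purpose section.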
Knowledge and Experience
- Bachelor’s Degree or the equivalent combination of education, training, or work experience.
- 5+ years of professional experience in data engineering or database development.
- Strong hands-on experience with:
  - Writing complex SQL queries and stored procedures
  - Designing functions, views, and database schemas
  - Using Snowflake Streams, Tasks, Time Travel, and Cloning
- Proficiency in database performance tuning and optimization, including clustering, warehouse sizing, and caching.
- Experience configuring external stages to integrate with cloud storage (AWS S3, Azure Blob, etc.).
- Experience writing Python/Shell scripts for data processing (where needed).
- Knowledge of Tidal job scheduling is an added advantage.
- Proficiency with Git and experience working in an Agile/Scrum Software Development Life Cycle (SDLC).
- Excellent analytical, decision-making, and problem-solving skills.
- Ability to multitask in a fast-paced environment with a focus on timeliness, documentation, and communication with peers and business users.
- Strong verbal and written communication skills to engage both technical and non-technical audiences at various organizational levels.
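The Python/shell scripting called for above is typically lightweight file preparation before data is staged and loaded. As a minimal, stdlib-only sketch (the column names and cleaning rules are hypothetical, chosen only to show the shape of such a script):

```python
import csv
import io

def transform_csv(text: str) -> str:
    """Clean a CSV payload before staging: trim whitespace from every
    field and drop rows that are missing the primary key column 'id'.
    (The 'id'/'name' schema is a hypothetical example.)"""
    reader = csv.DictReader(io.StringIO(text))
    cleaned = []
    for row in reader:
        # Strip stray whitespace from all values
        row = {k: (v or "").strip() for k, v in row.items()}
        # Skip rows without a primary key
        if row.get("id"):
            cleaned.append(row)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(cleaned)
    return out.getvalue()
```

In practice a script like this would read from and write to cloud storage or a local landing directory; keeping the transformation as a pure function of text makes it straightforward to unit test, as the testing responsibilities above require.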