Description / Brief Summary of the Role (Job Purpose)
This role requires experience in Big Data and Enterprise Data Warehouse infrastructure, as well as adopting appropriate emerging technologies and solutions to answer business problems and identify opportunities that impact revenue or operating income.
Key Tasks and Responsibilities:
- Design and build publication-ready data pipelines using diverse sets of structured and unstructured data.
- Ensure data pipelines are created using credible qualitative and quantitative methodologies based on key insights. Pay close attention to data accuracy, and apply an in-depth understanding of data identification, collection, processing, and analysis methodologies. Actively consult, conduct pre-development workshops, develop POCs, and lead incubation efforts.
- Coach and mentor Data Engineers / Specialists, leading by example as an individual contributor.
Qualifications:
- Bachelor’s degree in Engineering / Technology or equivalent
- Specialization in Computer Science, Software Engineering, Information Technology, or a related field is preferred
- Certifications such as Microsoft Certified Solutions Associate (MCSA), Azure Data Solution, AWS Solutions Architect, or Snowflake certification are desirable
- Certification in Azure, Databricks, Scala, Python, or visualization techniques is preferred
Experience:
- 3+ years of software development experience, with 4+ years of experience working in data engineering.
- 2+ years of experience with cloud / on-premises data warehouses and data modeling.
- 2+ years of hands-on experience creating technical solutions on cloud platforms (Snowflake EDW, Informatica IICS, HVR); design and development experience on SQL Server is advantageous.
- Demonstrated experience in progressively challenging and responsible roles. Must have experience working in a matrix organization structure.
Essential skills:
- Experience working with data engineering technologies such as NoSQL, AWS, Kafka, Hadoop, etc.
- Experience working with cloud data warehouses such as Redshift, Azure SQL DW, Snowflake, and Google BigQuery.
- Excellent data-crunching and presentation / interpretation skills, including story building.
- Working knowledge of cloud / big data platforms such as Azure or AWS; exposure to the Honeywell Forge big data platform.
- Must be a good communicator of vision, ideas, and work requirements
- Must be an innovative and integrative thinker, quick to visualize solutions and generate ideas
- Must be able to mentor / coach / guide experienced technology specialists.
- Strong business acumen, with the ability to draw clear connections between data modeling activities and business challenges / opportunities.
- Team player, open to feedback, and flexible and adaptable to changing needs.
Desired skills:
- Knowledge of Elasticsearch, Kibana, and REST APIs.
- Experience with agile and DevOps methodologies.
- Experience developing IoT connectivity solutions using Azure Event Hubs / Apache Kafka.
Additional Information
- JOB ID: HRD241476
- Category: Integrated Supply Chain
- Location: Devarabisanahalli Village, KR Varturhobli, East Taluk - Phase I, Bangalore, Karnataka, 560103, India
- Exempt