In this role, you will:
- Lead complex initiatives with broad impact and act as a key participant in large-scale software planning for the Technology area
- Design, develop, and run tooling to discover problems in data and applications and report the issues to engineering and product leadership
- Review and analyze complex software enhancement initiatives for business, operational, or technical improvements that require in-depth evaluation of multiple factors, including intangible or unprecedented factors
- Make decisions in complex and multi-faceted data engineering situations requiring an understanding of software package options, programming languages, and compliance requirements that influence and lead Technology to meet deliverables and drive organizational change
- Strategically collaborate and consult with internal partners to resolve high-risk data engineering challenges
Required Qualifications:
- 5+ years of Database Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Desired Qualifications:
- A bachelor’s degree or higher in computer science
- 5+ years of software engineering experience
- 5+ years of experience working with Spark, Hadoop, and Big Data
- 5+ years of experience with Spark SQL, Spark Streaming, and the DataFrame/Dataset APIs
- 3+ years of experience with Spark query tuning and performance optimization
- Deep understanding of Hadoop and cloud platforms, HDFS, ETL/ELT processes, and Unix shell scripting
- 5+ years of experience working with Relational Database Management Systems (RDBMS) such as SQL Server, Oracle, or MySQL
- 3+ years of experience with SQL & NOSQL database integration with Spark (MS SQL server and MongoDB)
- 2+ years of Agile experience
- Deep understanding of distributed systems (CAP theorem, partitioning and bucketing, replication, memory layouts, consistency)
- 3+ years of microservices development experience
- 3+ years of experience building cloud-ready enterprise solutions in one or a combination of the following: Amazon Web Services (AWS), Google Cloud Platform (GCP) or Pivotal Cloud Foundry (PCF)
- 2+ years of experience with Apache Kafka or Confluent Enterprise
Job Expectations:
- Ability to work effectively in a team environment, as well as independently
Wells Fargo Recruitment and Hiring Requirements:
- Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.