In this role, you will:
- Consult with business line and enterprise functions on less complex research
- Use functional knowledge to assist in developing non-model quantitative tools that support strategic decision making
- Perform analysis of findings and trends using statistical methods, and document the process
- Present recommendations to increase revenue, reduce expenses, and maximize operational efficiency, quality, and compliance
- Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
- Participate in all group technology efforts including design and implementation of database structures, analytics software, storage, and processing
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
- Understand compliance and risk management requirements for supported area
- Ensure adherence to data management or data governance regulations and policies
- Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups
Required Qualifications:
- 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Desired Qualifications:
- Hands-on proficiency in Business Intelligence (BI) tools, particularly Microsoft Power BI, Microsoft Fabric, Power Automate, and the Power Platform.
- Hands-on proficiency in one or more programming languages used for analytics and data science, such as SAS, Python, PySpark, Spark SQL, or Scala.
- Strong hands-on knowledge of SQL and experience with database management systems (e.g., Teradata, PostgreSQL, MySQL, or NoSQL databases).
- Familiarity with data warehousing and big data technologies (e.g., Hadoop, Spark, Snowflake, Redshift).
- Experience with ELT/ETL tools and data integration techniques.
- Experience optimizing code for performance and cost.
- Comfortable with code management and agile process tools such as GitHub and JIRA.
- Exposure to developing solutions for high-volume, low-latency applications, and the ability to operate in a fast-paced, highly collaborative environment.
- Willingness to provide production support for data assets and products as required.
- Knowledge of data modeling and data warehousing best practices.
- Understanding of data governance, data quality, and data security principles.
- Strong problem-solving and communication skills.
- Ability to work in a collaborative team environment.
- Knowledge of cloud platforms (e.g., Azure and/or Google Cloud) is a plus.
Job Expectations:
- Design, develop, and maintain ETL (Extract, Transform, Load) and ELT processes and data pipelines to move and transform data from various sources into a centralized data repository.
- Design, implement, and optimize data warehouses and data lakes to ensure scalability, performance, and data consistency.
- Create and manage data models to support business requirements, ensuring data accuracy, integrity, and accessibility.
- Integrate data from diverse sources, including databases, APIs, third-party services, and streaming data, and ensure data quality and consistency.
- Cleanse, transform, and enrich raw data to make it suitable for analysis and reporting.
- Implement and enforce data security measures to protect sensitive information and ensure compliance with data privacy regulations (e.g., GDPR, HIPAA).
- Independently own the end-to-end life cycle of BI products: build, operate, maintain, enhance, publish, and sunset them for enterprise stakeholders in BI tools such as Tableau and Power BI, keeping all required documentation and artifacts (e.g., SOPs, previous versions, secondary quality reviews) up to date.
- Continuously monitor and optimize data pipelines and databases for improved performance and efficiency.
- Develop and implement automated testing procedures to validate data quality and pipeline reliability.
- Maintain thorough documentation of data processes, schemas, and data lineage to support data governance efforts.
- Collaborate with the wider team, including data scientists, analysts, software engineers, and other stakeholders, to understand their data requirements and provide data solutions that meet their needs.
- Utilize version control systems to manage code and configurations related to data pipelines.
- Diagnose and resolve data-related issues and provide technical support as needed.
- Working Hours: 1:30PM-10:30PM India Time
17 Sep 2025
Wells Fargo Recruitment and Hiring Requirements:
Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.