Responsibilities:
- Design and develop big data solutions; partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop or Snowflake
- Deliver a data-as-a-service framework
- Move all legacy workloads to the cloud platform
- Ensure automation through CI/CD across platforms, both in the cloud and on-premises
- Research and assess open-source technologies and public cloud (AWS/GCP) stack components, and recommend and integrate them into the design and implementation
- Serve as the technical expert and mentor other team members on Big Data and cloud tech stacks
- Define needs around maintainability, testability, performance, security, quality, and usability for the data platform
- Drive implementation of consistent patterns, reusable components, and coding standards for data engineering processes
- Tune Big Data applications on Hadoop and non-Hadoop platforms for optimal performance
- Evaluate new IT developments and evolving business requirements and recommend appropriate systems alternatives and/or enhancements to current systems by analyzing business processes, systems and industry standards.
- Apply an in-depth understanding of how data analytics integrate within the sub-function, and coordinate and contribute to the objectives of the entire function
- Produce detailed analyses of issues where the best course of action is not evident from the available information but must nonetheless be recommended or taken
- Supervise day-to-day staff management issues, including resource management, work allocation, mentoring/coaching, and other duties and functions as assigned
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications:
- 6-10 years of experience using code for statistical modeling of large data sets
- 8+ years of experience with Hadoop (Cloudera)/big data technologies
- 5+ years of experience in public cloud infrastructure (AWS or GCP)
- Experience with Kubernetes and cloud-native technologies
- Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.)
- Advanced knowledge of the Hadoop ecosystem and Big Data technologies
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)
- Experience designing and developing data pipelines for data ingestion or transformation using Java, Scala, or Python
- Experience with Spark programming (PySpark, Scala, or Java)
- Expert-level experience building pipelines using Apache Spark
- Familiarity with core provider services from AWS, Azure, or GCP, preferably having supported deployments on one or more of these platforms
- Hands-on experience with Python/PySpark/Scala and basic machine learning libraries is required
- System-level understanding of data structures, algorithms, and distributed storage and compute
- Can-do attitude toward solving complex business problems; good interpersonal and teamwork skills
- Team management experience, including having led a team of data engineers and analysts
- Experience with Snowflake or Delta Lake is a plus
- Basic knowledge of Linux systems, operating system internals, and networking internals
Education:
- Bachelor’s degree/University degree or equivalent experience
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Responsibilities:
- Integrates subject matter and industry expertise within a defined area.
- Contributes to data analytics standards around which others will operate.
- Employs developed communication and diplomacy skills to guide, influence, and convince others, in particular colleagues in other areas and occasional external customers.
- Resolves occasionally complex and highly variable issues.
- Responsible for the volume, quality, timeliness, and delivery of data science projects, along with short-term resource planning.
Education:
- Bachelor’s/University degree or equivalent experience; potentially a Master’s degree
Full time | Irving, Texas, United States | $125,760.00 - $188,640.00
Anticipated Posting Close Date:
Apr 17, 2025