Responsibilities:
Integrates subject matter and industry expertise within a defined area.
Contributes to data analytics standards around which others will operate.
Applies in-depth understanding of how data analytics collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the entire function.
Employs developed communication and diplomacy skills to guide, influence and convince others, in particular colleagues in other areas and occasional external customers.
Resolves occasionally complex and highly variable issues.
Produces detailed analysis of issues where the best course of action is not evident from the information available, but actions must be recommended/ taken.
Responsible for volume, quality, timeliness and delivery of data science projects, along with short-term resource planning.
Appropriately assesses risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications:
6-10 years of experience writing code for statistical modeling of large data sets
5+ years of experience in Hadoop/big data technologies.
3+ years of experience in Spark.
2+ years of experience in Snowflake.
2+ years of experience developing data solutions on Google Cloud or AWS. Certifications preferred.
Advanced knowledge of the Hadoop ecosystem and Big Data technologies
Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)
Comprehensive knowledge of the principles of software engineering and data analytics
Knowledge of Agile (Scrum) development methodology is a plus.
Strong development/automation skills
System-level understanding: data structures, algorithms, distributed storage & compute
Can-do attitude on solving complex business problems, good interpersonal and teamwork skills.
Education:
Bachelor’s/University degree or equivalent experience; Master’s degree preferred.
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Integral member of Data Engineering team, responsible for design and development of Big Data solutions. Partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop for regulatory requirements.
Responsible for migrating all legacy workloads to the EAP platform for regulatory reporting.
Define requirements around maintainability, testability, performance, security, quality and usability for the data platform.
Drive implementation, consistent patterns, reusable components, and coding standards for data engineering processes.
Tune Big Data applications on Hadoop and non-Hadoop platforms for optimal performance.
Be the technical expert and mentor other team members on Big Data and Cloud Tech stacks.