The Big Data Lead Engineer is responsible for building Data Engineering solutions using next-generation data techniques. The position also liaises between business users and technologists to exchange information about solutions, including requirements and usage information.
Responsibilities:
Serve as an integral member of our Data Engineering team, responsible for the design and development of Big Data solutions
Partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop or Snowflake
Deliver a data-as-a-service framework
Develop, optimize, and manage large-scale data processing systems and analytics platforms.
Work with data scientists to build client pipelines using heterogeneous sources, and provide engineering services for data science applications
Ensure automation through CI/CD across platforms both in cloud and on-premises
Research and assess open-source technologies and components, and recommend and integrate them into the design and implementation
Be the technical expert and mentor other team members on Big Data and Cloud Tech stacks
Define needs around maintainability, testability, performance, security, quality, and usability for the data platform
Drive implementation, consistent patterns, reusable components, and coding standards for data engineering processes
Tune Big data applications on Hadoop and non-Hadoop platforms for optimal performance
Evaluate new IT developments and evolving business requirements and recommend appropriate systems alternatives and/or enhancements to current systems by analyzing business processes, systems and industry standards.
Supervise day-to-day staff management issues, including resource management, work allocation, mentoring/coaching and other duties and functions as assigned
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications:
8+ years of experience in Hadoop/Big Data technologies.
2-4 years of experience in Kafka streaming development including design, development, implementation through CI/CD pipeline and validation of real-time data streaming solutions for high performance applications.
2-4 years of experience in API and microservices development to design, build, and maintain scalable, secure, and high-performance microservices architectures and API solutions.
2-4 years of experience as a Python developer with expertise in automation testing to design, develop, and automate robust software solutions and testing frameworks.
Experience in managing and implementing successful projects
Subject Matter Expert (SME) in at least one area of Applications Development
Ability to adjust priorities quickly as circumstances dictate
Demonstrated leadership and project management skills
Consistently demonstrates clear and concise written and verbal communication
Education:
Bachelor’s degree/University degree or equivalent experience
Master’s degree preferred
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.