Key Responsibilities:
Required Skills/Experience:
Preferred Skills/Experience:
Education:
Other Relevant Skills
The Lead Data Engineer will be responsible for designing, implementing, and optimizing distributed data processing jobs that handle large-scale data in the Hadoop Distributed File System (HDFS) and S3 storage using Apache Kafka, Apache Flink (Java and Flink SQL), Apache Spark, and Python. This role requires a deep understanding of data engineering principles, proficiency in Java and Python, and hands-on experience with the Kafka and S3 ecosystems. The developer will collaborate with data engineers, analysts, and business stakeholders to process and transform data and to drive insights and data-driven decisions.
Responsibilities:
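By way of illustration only (not part of the posting), a minimal sketch of the kind of pipeline this role describes: a PySpark Structured Streaming job that reads events from a Kafka topic and lands them in S3 as Parquet. The broker, topic, bucket, and schema names below are assumed placeholders.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Requires the spark-sql-kafka connector on the classpath.
spark = SparkSession.builder.appName("events-kafka-to-s3").getOrCreate()

# Hypothetical event schema; the real payload would be defined by the team.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

# Kafka delivers the payload as bytes; cast to string and parse with the schema.
parsed = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

# Write the parsed stream to S3 as Parquet, with checkpointing for fault tolerance.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/events/")                    # placeholder bucket
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/")
         .start())
query.awaitTermination()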
Qualifications:
Education:
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Additional Responsibilities:
Data Processing and Transformation:
Data Distribution:
Performance Optimization:
Data Engineering with Hadoop, Spark, Kafka, Flink:
Coding standard adherence:
Time Type:
Responsibilities:
Qualifications:
Education:
Anticipated Posting Close Date:
The Quality Engineer - Automation is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new techniques and the improvement of processes and workflow for the area or function. Integrates subject matter and industry expertise within a defined area. Requires an in-depth understanding of how areas collectively integrate within the sub-function, as well as how they coordinate and contribute to the objectives of the function and overall business. Evaluates moderately complex and variable issues with substantial potential impact, where developing an approach or taking an action involves weighing various alternatives and balancing potentially conflicting situations using multiple sources of information. Requires good analytical skills to filter, prioritize, and validate potentially complex and dynamic material from multiple sources. Strong communication and diplomacy skills are required. Regularly assumes an informal or formal leadership role within teams. Involved in coaching and training of new recruits. Significant impact in terms of project size, geography, etc. by influencing decisions through advice, counsel, and/or facilitating services to others in the area of specialization. The work and performance of all teams in the area are directly affected by the performance of the individual.
Responsibilities:
Qualifications:
Education:
Time Type:
Whether you’re at the start of your career or looking to discover your next adventure, your story begins here. At Citi, you’ll have the opportunity to expand your skills and make a difference at one of the world’s most global banks. We’re fully committed to supporting your growth and development from the start with extensive on-the-job training and exposure to senior leaders, as well as more traditional learning. You’ll also have the chance to give back and make a positive impact where we live and work through volunteerism.
Shape your Career with Citi
Citi’s Risk Management organization oversees risk-taking activities and assesses risks and issues independently of the front line units. We establish and maintain the enterprise risk management framework that ensures the ability to consistently identify, measure, monitor, control and report material aggregate risks.
Hybrid (Internal Job Title: Risk Reporting Sr. Officer I - C14), based in Mumbai, India. Being part of our team means that we’ll provide you with the resources to meet your unique needs, empower you to make healthy decisions, and manage your financial well-being to help plan for your future. For instance:
Responsibilities:
As a successful candidate, you’d ideally have the following skills and exposure:
Education:
Time Type:
Role Overview:
As a Data Quality Engineer in the Global Markets division of a leading investment bank, you will ensure the integrity, accuracy, and consistency of data across critical trading, risk, and compliance systems.
This role requires a technical mindset and strong hands-on skills. You will work closely with trading technology teams, quants, risk management, and data engineering to implement data quality solutions that ensure high-fidelity financial data for decision-making and regulatory compliance.
Key Responsibilities:
Develop and implement automated data quality checks for real-time and batch trading data (a minimal sketch of such a check follows this list).
Support initiatives related to User Acceptance Testing (UAT) process and product rollout into production.
Monitor and validate data pipelines for trade execution, pricing models, risk analytics, and post-trade settlement.
Work with trading desks, quants, and data engineers to identify and resolve data anomalies.
Build data profiling, lineage tracking, and metadata management solutions to ensure traceability and compliance.
Integrate data validation rules into CI/CD pipelines, ensuring continuous data reliability in trading applications.
Conduct root cause analysis for data quality issues and drive corrective actions.
Ensure compliance with global financial regulations.
Collaborate with DevOps and Cloud teams to optimize data quality solutions in cloud-based environments (AWS, Azure, GCP).
Leverage AI/ML-based anomaly detection models to proactively detect data inconsistencies.
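As a rough illustration only (not part of the posting), here is a minimal sketch of an automated batch data quality check of the kind referenced above, written in Python with pandas; the column names and validation rules are assumed for the example.

import pandas as pd

# Hypothetical required fields for a batch of trade records.
REQUIRED_COLUMNS = ["trade_id", "symbol", "price", "quantity", "trade_time"]

def check_trade_batch(trades: pd.DataFrame) -> list[str]:
    """Return a list of data quality issues found in the batch (empty if clean)."""
    issues = []
    missing = [c for c in REQUIRED_COLUMNS if c not in trades.columns]
    if missing:
        return [f"missing columns: {missing}"]
    if trades["trade_id"].duplicated().any():
        issues.append("duplicate trade_id values")
    if trades["price"].le(0).any():
        issues.append("non-positive prices")
    if trades[REQUIRED_COLUMNS].isna().any().any():
        issues.append("null values in required fields")
    return issues

A check like this could run as a gate in a CI/CD pipeline or on a schedule, raising an alert or failing the run whenever issues are returned.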
Key Skills & Qualifications:
Minimum 4 years of experience in Quality Assurance.
Strong experience in automation using Java for backend and API testing.
Hands-on experience in automation and tooling using Python.
Proficiency in SQL for data validation and querying large datasets.
Strong analytical and troubleshooting skills for debugging data quality issues.
Ability to work in a fast-paced trading environment with cross-functional teams.
Experience embedding data quality tests in DevOps CI/CD pipelines.
Experience working in AWS, Azure, or GCP data services is a plus.
Preferred Qualifications:
Bachelor's degree in Computer Science, Software/Computer Engineering, or a related field.
Time Type: