What You’ll Do:
Lead the design and build of an enterprise-level, scalable, low-latency, fault-tolerant streaming data platform that provides meaningful and timely insights
Build next-generation distributed streaming data pipelines and analytics data stores using streaming frameworks (e.g., Flink, Spark Streaming) and programming languages such as Java, Scala, and Python
Lead a group of engineers building data pipelines with big data technologies (Spark, Flink, Kafka, AWS Big Data Services, Snowflake, Redshift) on medium- to large-scale datasets
Influence best practices for data pipeline design, data architecture, and the processing of structured and unstructured data
Work in a creative and collaborative environment driven by agile methodologies, with a focus on CI/CD, application resiliency standards, and partnership with Cyber & Security teams
Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions using full-stack development tools and technologies
Work with a team of developers with deep experience in machine learning, distributed microservices, and full stack systems
Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
Basic Qualifications:
Bachelor’s Degree
At least 6 years of experience in application development (Internship experience does not apply)
At least 2 years of experience in big data technologies
At least 1 year of experience with cloud computing (AWS, Microsoft Azure, Google Cloud)
Preferred Qualifications:
7+ years of experience in application development including Java, Python, SQL, Scala
4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
4+ years of experience with distributed data computing tools (Flink, Kafka, Spark, etc.)
4+ years of experience working on real-time data and streaming applications
4+ years of experience with NoSQL implementation (DynamoDB, OpenSearch)
4+ years of data warehousing experience (Redshift or Snowflake)
4+ years of experience with UNIX/Linux including basic commands and shell scripting
2+ years of experience with Agile engineering practices
Eligibility varies based on full or part-time status, exempt or non-exempt status, and management level.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.