Primary Responsibilities:
Define and implement data architecture standards, frameworks and guidelines to ensure data platform efficiency and high-quality data for insights and downstream consumption
Lead the creation of data models to standardize data definitions, relationships and semantics across systems
Collaborate with extended teams and stakeholders to establish data standards, metadata management practices and data quality frameworks
Design scalable architectures that integrate various data sources, systems, and platforms while minimizing duplication
Partner with engineering, data analysis, data science and business teams to align data solutions with business needs. Mentor technical teams in data architecture best practices
Develop comprehensive architectural documentation and communicate data architecture principles to both technical and non-technical stakeholders
The successful candidate will:
Have hands on experience with AWS Cloud data technologies including RDS, DynamoDB, S3 and Glue ETL
Have experience designing and implementing end-to-end data pipelines supporting both production and analytic use cases
Have experience designing and implementing data management solutions that enable Data Quality, Reference Data Management, and Metadata Management
Be comfortable coding with Python or Scala and proficient in SQL
Have an in-depth understanding of the Parquet, Delta Lake and Iceberg data formats
Have a background in using multiple data storage technologies including relational, document, key/value, graph and object stores
Have the ability to decompose large problems and execute smaller, manageable bodies of work to demonstrate continuous architecture delivery
Have an understanding of machine learning and AI data infrastructure needs
Basic Qualifications:
Bachelor's Degree
At least 6 years of experience in Data Engineering and Data Architecture or Technology Solution design
At least 4 years of experience in big data technologies
At least 2 years of experience in data modeling and platform design
At least 2 years of experience creating solutions architectures in a public cloud (AWS, Microsoft Azure, Google Cloud)
Preferred Qualifications:
Master’s Degree in Computer Science, Engineering or Information Technology
9+ years of experience in application development including Python, SQL, Scala, or Java
5+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
5+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Databricks)
4+ years of experience working on real-time data and streaming applications
4+ years of experience with NoSQL implementations
4+ years of data warehousing experience (Redshift or Snowflake)
Familiarity with industry standards related to data
Experience with data mesh, data lakehouse architectures and real-time data pipelines
Eligibility varies based on full- or part-time status, exempt or non-exempt status, and management level.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.