See some more on BigBrain here:
As a Data Engineer in the BigBrain Core team, you will be part of a growing group that’s building a best-in-class data ecosystem to support monday.com’s hyper-scale growth and most critical data needs.
You will design, implement, and support robust, scalable data solutions, and take ownership of core components such as the Data Lake, batch and streaming infrastructure, high-scale data pipelines, and distributed systems.
In addition, you will play a key role in empowering our infrastructure with AI capabilities – from building AI-assisted data pipelines and smart monitoring tools, to supporting AI-driven features and services across the organization.
● 3+ years of experience as a data engineer / big data developer.
● 3+ years of experience with a programming language (Python, Java, Scala, Node.js).
● Experience with designing data pipelines & data lakes.
● Experience with big data tools: Hadoop, Spark, Kafka, etc.
● Experience with SQL and database architecture.
● Experience with workflow orchestration and data processing tools such as Airflow.
● Experience working in a cloud environment, preferably AWS.
● Experience with Big Data architectures.
● Experience designing and building large-scale applications.