What you’ll be doing:
Collaborate with cross-functional teams, including full-stack engineers, data scientists, data engineers, and DevOps, to design, implement, and maintain scalable data pipelines, infrastructure, and workflows.
Develop and support data models, schemas, and database structures that deliver fast analytics performance. Continuously optimize data workflows to ensure seamless real-time and batch data processing.
Implement data quality checks and monitoring mechanisms to ensure data integrity and accuracy at every stage. Detect and resolve bottlenecks swiftly to maintain a highly available and reliable data ecosystem.
Control, monitor and optimize worldwide production servers.
Build and maintain scalable data pipelines, ensuring efficient data storage, retrieval, and transformation.
Stay ahead of the curve in data engineering technologies and trends. Introduce new tools, techniques, and best practices to improve our data infrastructure, empowering data scientists and analysts alike.
Foster a culture of learning and knowledge-sharing within the team. Guide and mentor junior data engineers, applying your expertise to uplift the entire department.
What we need to see:
Bachelor’s or master’s degree in Computer Science, Engineering, or a related field.
4+ years of hands-on experience as a Data Engineer, with a demonstrated track record of delivering complex data projects. Your experience speaks volumes.
Experience with cloud-based data platforms like AWS, Azure, or GCP. You harness the power of the cloud to unlock data's full potential.
Strong programming skills in languages such as Python, Java, or Scala, with experience in building scalable and efficient systems.
Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
Solid knowledge of Linux environments.
Ways to stand out from the crowd:
Unwavering attention to detail. You take pride in delivering data of the highest quality, turning it into insights that power critical business decisions.
Experience working with data engineering frameworks such as Kafka, Cloudera, Spark, Airflow, or Hadoop.
Thriving in a dynamic environment, able to juggle multiple priorities admirably without compromising on quality or precision.