Job responsibilities
Develops secure, high-quality production code, and reviews and debugs code written by others.
Conducts hands-on POCs to prove out concepts and products.
Migrates applications to internal and external clouds.
Migrates applications to microservices architectures and patterns.
Evaluates open-source and vendor products.
Designs distributed applications.
Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 5+ years applied experience
Solid programming skills in Java, Python, or equivalent languages.
Experience across the data lifecycle, including building data frameworks and working with data lakes.
Experience with batch and real-time data processing with Spark or Flink (a brief illustrative sketch follows this list).
Working knowledge of AWS Glue and EMR for data processing.
Experience working with Databricks.
Experience working with Python, Java, PySpark, and related tooling.
Working experience with both relational and NoSQL databases.
Experience building ETL data pipelines for both batch and real-time processing, along with data warehousing and NoSQL databases.
Strong analytical and critical thinking skills.
A good understanding of machine learning algorithms, including time series methods, deep learning, reinforcement learning, and classical ML techniques, is a plus.
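For illustration only, and not a description of this team's actual codebase or requirements: a minimal PySpark sketch of the kind of batch ETL work referenced above (extract raw files, transform, load to a curated zone). All paths, column names, and the aggregation itself are hypothetical placeholders.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-batch-etl").getOrCreate()

# Extract: read a raw CSV dataset (placeholder path)
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/trades/")

# Transform: cast the amount column and aggregate daily totals per account
daily_totals = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .groupBy("account_id", "trade_date")
       .agg(F.sum("amount").alias("total_amount"))
)

# Load: write partitioned Parquet to a curated zone (placeholder path)
daily_totals.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)

spark.stop()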
Preferred qualifications, capabilities, and skills
Cloud computing and containers: Google Cloud, Amazon Web Services (AWS), Azure, Docker, Kubernetes.
Experience in big data technologies: Hadoop, Hive, Spark, Kafka.
Experience in distributed systems design and development.