What you'll be doing:
Design and implement framework modules that extract data from various source systems, validate data integrity, apply business transformations, and store the data in a data lake (AWS, Azure, or others); a minimal sketch of this shape follows the list
Enable a self-service data and machine-learning platform for individual business units
Take ownership of the data platform and transformation tools used for AI/ML
Build, scale, and optimize self-serve ETL frameworks and streaming pipelines that handle data storage and real-time analytics
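The first responsibility above describes the classic extract-validate-transform-load shape. Purely as an illustration (the function names, types, and sample data below are invented, not drawn from the role), a minimal Python sketch of such a framework module might look like this:

import json
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Record:
    """A single row pulled from a source system."""
    payload: dict

def extract(rows: Iterable[dict]) -> list[Record]:
    """Wrap raw source rows; in practice this would page through an API or database."""
    return [Record(payload=r) for r in rows]

def validate(records: list[Record], required: set[str]) -> list[Record]:
    """Keep only records that carry every required field (a basic integrity check)."""
    return [r for r in records if required <= r.payload.keys()]

def transform(records: list[Record], rule: Callable[[dict], dict]) -> list[Record]:
    """Apply a business transformation rule to each record."""
    return [Record(payload=rule(r.payload)) for r in records]

def load(records: list[Record], path: str) -> None:
    """Persist as JSON lines; a real pipeline would write Parquet to S3 or ADLS."""
    with open(path, "w") as f:
        for r in records:
            f.write(json.dumps(r.payload) + "\n")

if __name__ == "__main__":
    raw = [{"id": 1, "amt": "10"}, {"amt": "broken"}]          # hypothetical source rows
    ok = validate(extract(raw), required={"id", "amt"})         # second row is rejected
    done = transform(ok, rule=lambda p: {**p, "amt": float(p["amt"])})
    load(done, "out.jsonl")

A production module would also report rejected records rather than silently dropping them, and would target cloud object storage instead of a local file.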
What we need to see:
Bachelor’s or Master’s degree in Computer Science or Information Systems, or equivalent experience, with programming knowledge (e.g., Python, database architectures)
5+ years of relevant experience
Experience in architecting, designing, developing, and maintaining data warehouses/data lakes for complex data ecosystems
Working knowledge of Amazon Web Services, Kubernetes, Docker, Terraform
Strong Python experience, with a focus on data extraction and transformation
Ways to stand out from the crowd:
Do you have a strong understanding of operational processes for semiconductor chips, boards, systems, and servers?
In-depth experience crafting ETL pipelines using Spark, SQL, and AWS/cloud services (a hypothetical sketch follows this list)
Experience with SAP systems integration, data marts, and SAP BusinessObjects, as well as with visualization tools like Tableau, Power BI, and Jupyter notebooks
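For the Spark/SQL/cloud bullet above, such pipelines often reduce to a read-validate-transform-write flow over columnar storage. A hypothetical PySpark sketch (the file paths and column names are invented for illustration; an S3 source would use s3a:// URIs instead):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

SOURCE, TARGET = "orders.csv", "out/orders_clean.parquet"  # hypothetical paths

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read the raw file, letting Spark infer column types.
df = spark.read.csv(SOURCE, header=True, inferSchema=True)

# Validate: drop rows missing the keys downstream reports depend on.
clean = df.dropna(subset=["order_id", "amount"])

# Transform: a trivial business rule, rounding amounts to two decimals.
result = clean.withColumn("amount", F.round(F.col("amount"), 2))

# Load: write Parquet, the usual data-lake storage format.
result.write.mode("overwrite").parquet(TARGET)
spark.stop()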
You will also be eligible for equity and benefits.