0+ years of related experience in a professional role with a Bachelor’s degree; or equivalent experience
Proven capability to apply knowledge and basic problem-solving techniques to define and resolve problems
Experience in big data processing and a basic understanding of ETL tools and processes; familiarity with SQL, Hadoop, Spark, Airflow, S3, MongoDB, etc. (a short Python ETL sketch follows this list)
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Working knowledge of Python and other scripting languages such as Bash or PowerShell
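As a concrete illustration of the scripting and ETL skills above, here is a minimal Python sketch that extracts rows from a CSV file, applies a simple cleansing transform, and loads the result into SQLite. The file, table, and column names are hypothetical assumptions, not part of this posting.

```python
# Minimal ETL sketch: CSV -> cleanse -> SQLite. All names are hypothetical.
import csv
import sqlite3

def run_etl(src_csv: str = "orders.csv", db_path: str = "warehouse.db") -> None:
    # Extract: read raw rows from the source file.
    with open(src_csv, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: trim whitespace and drop rows missing an order id.
    cleaned = [
        {"order_id": r["order_id"].strip(), "amount": float(r["amount"])}
        for r in rows
        if r.get("order_id", "").strip()
    ]

    # Load: upsert into a SQLite table.
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders VALUES (:order_id, :amount)", cleaned
        )

if __name__ == "__main__":
    run_etl()
```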
Essential Requirements
Develop technical tools and programs to cleanse, organize and transform data and to maintain, protect and update data structures and integrity on an automated basis (see the orchestration sketch after this list)
Apply data extraction, transformation and loading techniques to tie together large data sets from a variety of sources
Design, develop and program methods, processes and systems to capture, manage, store and utilize structured and unstructured data to generate actionable insights and solutions
Work in collaboration with data engineers, data scientists, architects and business stakeholders to design strategic projects and break complex problems into actionable tasks
Build and maintain tooling and infrastructure for automating release, deployment, and upgrades
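The automated cleanse/transform/load duties above are commonly expressed as an orchestrated pipeline. Below is a minimal sketch, assuming Airflow 2.4+ (one of the tools named in the qualifications); the DAG id, schedule, and task names are hypothetical.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+); names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull raw records from a source system (placeholder)."""

def transform():
    """Cleanse and normalize the extracted records (placeholder)."""

def load():
    """Write the transformed records to the warehouse (placeholder)."""

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # run the pipeline once per day, automatically
    catchup=False,
) as dag:
    # Chain the steps so they run in order on the automated schedule.
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )
```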
Desirable Requirements
Experience in developing APIs and microservices, with familiarity in version control systems like Git and CI/CD tools such as GitLab (see the service sketch after this list).
Understanding of containerization and orchestration technologies such as Docker and Kubernetes.
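To make the API and containerization items concrete, here is a minimal sketch of a microservice endpoint, assuming FastAPI is installed; the service title and endpoint are hypothetical.

```python
# Minimal microservice sketch (assumes FastAPI is installed); names hypothetical.
from fastapi import FastAPI

app = FastAPI(title="example-service")

@app.get("/health")
def health() -> dict:
    # Orchestrators such as Kubernetes typically probe an endpoint like this
    # to decide whether a container is alive and ready for traffic.
    return {"status": "ok"}
```

A service like this can be run locally with `uvicorn main:app` (assuming the file is named main.py); packaging it into a Docker image and deploying it via Kubernetes is the workflow the desirable requirements point toward.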