Bachelor’s degree in Engineering, Computer Science, a related field, or equivalent practical experience.
1 year of experience coding with one or more programming languages (e.g., Python, Java, C/C++).
1 year of experience designing data pipelines (ETL) and modeling data for synchronous and asynchronous system integration and implementation.
1 year of experience analyzing data, querying databases (e.g., SQL), and creating dashboards/reports.
Preferred qualifications:
Master’s degree in Engineering, Computer Science, or a related field.
1 year of experience partnering with stakeholders (e.g., users, partners, customers).
1 year of experience developing project plans and delivering projects on time, within budget, and in scope.
Experience in large-scale distributed data processing.
Experience with Unix or GNU/Linux systems.
Experience designing data models and data warehouses, with an understanding of non-relational data storage systems (e.g., NoSQL and distributed database management systems).