Your Responsibilities:
- Design and implement data pipelines for seamless data ingestion.
- Utilize AWS technologies, including S3, Glue, and Lambda, for effective data management.
- Collaborate with cross-functional teams to enhance data processes using Python, PySpark, and SQL.
- Implement CI/CD deployments for streamlined development processes.
- Apply data warehousing concepts to ensure efficient data storage and retrieval.
- Orchestrate data workflows using AWS Step Functions and Apache Airflow.
Skills and attributes for success:
- Minimum of 5 years of hands-on experience in data engineering.
- Proficiency in AWS services (S3, Glue, Lambda, Redshift, Airflow, Step Functions).
- Strong programming skills in Python and PySpark.
- Excellent SQL skills for data manipulation and analysis.
Ideally, you’ll also have:
- Experience with Snowflake for data warehousing solutions.
- AWS Data Engineer Associate Certification.
What we look for
People who can work collaboratively to provide services across multiple client departments while adhering to commercial and legal requirements. You will need a practical approach to solving complex problems and the ability to deliver insightful, workable solutions.
If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible.