Job responsibilities
- Executes security solutions design, development, and technical troubleshooting with the ability to apply knowledge of existing security solutions to satisfy security requirements for internal clients (e.g., product, platform, application owners)
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Applies specialized tools (e.g., vulnerability scanner) to analyze and correlate incident data to identify, interpret, and summarize the probability and impact of threats when determining specific vulnerabilities
- Leads delivery of continuity-related awareness, training, educational activities, and exercises
- Adds to team culture of diversity, equity, inclusion, and respect
- Designs, builds, and deploys scalable ETL pipelines leveraging Trino, PySpark, and AWS services (S3, Glue) for large-scale data processing (see the sketch following this list)
- Leverages AWS services such as S3, SNS, and Athena for storage, messaging, and data querying in the cloud environment
- Uses tools such as Kestra or Airflow to automate, schedule, and monitor complex data workflows, ensuring reliable data flow and timely execution
- Works with a wide array of data formats appropriate for building a modern data stack
- Leverages Docker and Kubernetes for containerization and orchestration to achieve scalable deployments
- Writes and optimizes SQL queries for data transformation and analysis, with a focus on performance
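As a rough illustration of the ETL work described above, the following is a minimal PySpark sketch that reads Parquet data from S3, aggregates it, and writes partitioned output back to S3. The bucket paths and column names are hypothetical, and the S3A connector and AWS credentials are assumed to be configured.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal illustrative ETL step; bucket paths and column names are hypothetical.
spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Read raw Parquet data from S3 (assumes the S3A connector and credentials are configured).
orders = spark.read.parquet("s3a://example-raw-bucket/orders/")

# Keep recent records and aggregate order amounts per customer and day.
daily_totals = (
    orders
    .filter(F.col("order_date") >= "2024-01-01")
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write back to S3, partitioned by date, for downstream querying (e.g., via Athena or Trino).
(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-bucket/orders_daily/")
)
```

In practice, a step like this would typically be scheduled and monitored by an orchestrator such as Airflow or Kestra, as noted in the responsibilities above.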
Required qualifications, capabilities, and skills
- Formal training or certification on data engineering concepts and 3+ years applied experience
- Experience developing security engineering solutions
- Proficient in coding in one or more languages
- Overall knowledge of the Software Development Life Cycle
- Solid understanding of agile methodologies and of practices such as CI/CD, application resiliency, and security
- Knowledge of AWS services such as S3, Athena, SNS, SQS, and Glue
- Experience with Apache Airflow and/or Kestra for automating data flows
- Proficiency in PySpark for data processing
- Proven experience with data formats, open table formats, and data partitioning
Preferred qualifications, capabilities, and skills
- Cloud computing: Amazon Web Services, Docker, Kubernetes
- Experience in big data technologies: Kafka, Iceberg
- Hands-on experience with Terraform
- Experience in distributed system design and development
- AWS Certification is a plus