Job responsibilities
- Executes standard security solutions in accordance with existing playbooks to satisfy security requirements for internal clients (e.g., product, platform, and application owners)
- Writes secure and high-quality code in at least one programming language with limited guidance
- Applies specialized tools (e.g., vulnerability scanners) to analyze and correlate incident data, identifying specific vulnerabilities and interpreting and summarizing the probability and impact of threats
- Supports delivery of continuity-related awareness, training, educational activities, and exercises
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification in security engineering concepts and 2+ years of applied experience.
- Demonstrated expertise in agile methodologies, including CI/CD, application resiliency, and security practices.
- Hands-on experience building scalable data pipelines for large data sets with Spark (SparkSQL) and Airflow or similar scheduling/orchestration tools.
- Proficient in big data technologies such as Hadoop, Hive, HBase, Spark, and EMR.
- Skilled in working with MPP frameworks like Presto, Trino, and Impala.
- Experience with AWS big data services (Glue, EMR, Lake Formation, Redshift) or equivalent Apache projects (Spark, Flink, Hive, Kafka).
Preferred qualifications, capabilities, and skills
- Familiarity with building stream-processing systems using solutions such as Storm or Spark Streaming.
- Understanding of Trino, dbt, ETL processes, SQL scripting, and Python programming.