Being the cybersecurity partner of choice, protecting our digital way of life.
Your Impact
- Design, develop, and maintain data infrastructure to support ETL, real-time pipelines, and data science and AI/ML workloads.
- Build and maintain CI/CD pipelines to support various workflows.
- Support data ingestion workflows from various sources into our data warehouse and data lake environments.
- Collaborate with stakeholders to gather requirements and translate business needs into technical solutions.
- Optimize and tune existing data pipelines for performance, reliability, and scalability.
- Implement data quality and governance processes to ensure data accuracy, consistency, and compliance with regulatory standards.
- Mentor junior members of the team and provide guidance on best practices for data engineering and BI development.
Your Experience
- UG/PG degree in Computer Science, Engineering, or a related field, or equivalent military experience, required.
- 5 to 8 years of experience in DevOps, data platform operations, and ML workloads, with a focus on building and maintaining data and AI/ML tooling.
- Familiarity with cloud platforms such as Google Cloud Platform (GCP), and experience with relevant services (e.g., Dataflow, Dataproc, BigQuery, stored procedures, Cloud Composer).
- Experience with big data and observability tools such as Airflow, Kafka, Grafana, Prometheus, and the LGTM stack.
- Experience with object-oriented and functional scripting languages such as Python and Scala.
- Strong analytical and problem-solving skills, with the ability to analyze complex data sets and derive actionable insights.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
All your information will be kept confidential according to EEO guidelines.