

Being the cybersecurity partner of choice, protecting our digital way of life.
Your Impact
Design, develop, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse or data lake environment
Nice-to-have: proactively identify and implement GenAI-driven solutions that measurably improve the reliability and performance of data pipelines, or that optimize key processes such as data quality validation and root-cause analysis for data issues
Collaborate with stakeholders to gather requirements and translate business needs into technical solutions
Optimize and tune existing data pipelines for performance, reliability, and scalability
Implement data quality and governance processes to ensure data accuracy, consistency, and compliance with regulatory standards (a minimal validation sketch follows this list)
Work closely with the BI team to design and develop dashboards, reports, and analytical tools that provide actionable insights to stakeholders
Mentor junior members of the team and provide guidance on best practices for data engineering and BI development
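To make the data quality work above concrete, here is a minimal sketch of a rule-based batch validation in Python. The table, column names, and rules are hypothetical illustrations of the technique, not part of this posting.

    import pandas as pd

    def run_quality_checks(df: pd.DataFrame) -> list[str]:
        """Return human-readable failures; an empty list means the batch passes."""
        failures = []
        # Completeness: key business columns must not contain nulls (hypothetical columns).
        for col in ("opportunity_id", "account_id", "amount"):
            nulls = int(df[col].isna().sum())
            if nulls:
                failures.append(f"{col}: {nulls} null value(s)")
        # Uniqueness: the primary key must not repeat within a batch.
        dupes = int(df["opportunity_id"].duplicated().sum())
        if dupes:
            failures.append(f"opportunity_id: {dupes} duplicate row(s)")
        # Validity: deal amounts should never be negative.
        negatives = int((df["amount"] < 0).sum())
        if negatives:
            failures.append(f"amount: {negatives} negative value(s)")
        return failures

A pipeline would typically run such checks after each load and fail the task or quarantine the batch when the list is non-empty, surfacing root causes before bad data reaches downstream dashboards.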
Your Experience
Bachelor's degree in Computer Science, Engineering, or a related field
5+ years of experience in data engineering, with a focus on building and maintaining data pipelines and analytical solutions
Nice-to-have: demonstrated readiness to leverage GenAI tools for efficiency across the data engineering lifecycle, for example by generating complex SQL queries, scaffolding initial Python/Spark scripts, or auto-generating pipeline documentation
Expertise in SQL programming and database management systems
Hands-on experience with ETL tools and technologies (e.g. Apache Spark, Apache Airflow)
Familiarity with cloud platforms such as Google Cloud Platform (GCP), and experience with relevant services (e.g. Dataflow, Dataproc, BigQuery, stored procedures, Cloud Composer); a minimal orchestration sketch follows this list
Experience with big data tools such as Spark and Kafka
Experience with object-oriented and functional scripting languages such as Python and Scala
Experience working with SFDC data objects (Opportunity, Quote, Accounts, Subscriptions, Entitlements) is highly desired
Experience with BI tools and visualization platforms (e.g. Tableau) is a plus
Strong analytical and problem-solving skills, with the ability to analyze complex data sets and derive actionable insights
Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams
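For context on the stack named above, here is a minimal sketch of an Airflow DAG of the kind Cloud Composer runs, merging a staging table into a BigQuery warehouse table. The project, dataset, table, and task names are hypothetical; this illustrates the tools, not the team's actual pipelines.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    # Hypothetical identifiers, for illustration only.
    MERGE_SQL = """
        MERGE `my-project.sales.opportunities` AS t
        USING `my-project.staging.opportunities` AS s
        ON t.opportunity_id = s.opportunity_id
        WHEN MATCHED THEN UPDATE SET t.amount = s.amount
        WHEN NOT MATCHED THEN INSERT ROW
    """

    with DAG(
        dag_id="sfdc_opportunity_load",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Run the MERGE as a BigQuery job; standard SQL, not legacy.
        merge_staging = BigQueryInsertJobOperator(
            task_id="merge_staging_into_warehouse",
            configuration={"query": {"query": MERGE_SQL, "useLegacySql": False}},
        )

In practice a DAG like this would sit downstream of an extract task (e.g. pulling SFDC objects into staging) and upstream of quality checks like the sketch earlier in this posting.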
Compensation Disclosure
The compensation offered for this position will depend on qualifications, experience, and work location. For candidates who receive an offer at the posted level, the starting base salary (for non-sales roles) or base salary + commission target (for sales/commissioned roles) is expected to be between $145,000/YR and $235,500/YR. The offered compensation may also include restricted stock units and a bonus. A description of our employee benefits may be found here.
All your information will be kept confidential according to EEO guidelines.