Work with the DB Lead to create and maintain optimal data pipeline architecture
Assemble large, complex data sets that meet functional / non-functional business requirements
Build the ETL jobs required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Google ‘big data’ technologies
Build APIs that expose the data so customers can consume and integrate with our data
Evaluate large and complex queries / stored procedures and recommend changes to optimize performance
Assist with the expansion into public cloud cost transparency
Lead automation efforts leveraging inner source interaction models with other teams
Assist data analysts on our team with query development, statistical analysis, and data visualization as needed
Maintain technical documentation on stored procedures and ETL logic used to manage the database and data ingestion pipeline
Collaborate with cross-functional teams to understand data needs and deliver solutions
BS in Computer Science, Information Systems, or an equivalent field; MS preferred
8+ years of working experience as a Data Engineer
Experience building and optimizing ETLs, data pipelines, architectures, and data aggregates
Proficient in Microsoft SQL Server 2016 and 2019 database development (T-SQL), data analysis, and support
Thorough understanding of RDBMS concepts and the ability to write complex SQL queries, stored procedures, functions, and triggers
Proficiency with Java and Python languages
Basic experience with database administration (managing access, scheduled jobs, etc.)
Basic experience with Google Cloud compute services
Basic experience with Google BigQuery
Understanding of REST API development and design
Very strong foundational knowledge in Object-Oriented Design Principles, Data Structures, Algorithms, and Software Engineering
CI/CD pipeline experience
Experience with version control using Git and code repositories like GitHub
Demonstrated experience working with large data sets and a love for working with data
Experience writing and maintaining technical documentation
Strong communication skills and a constant desire to grow, learn, and explore new things
Knowledge of, or strong interest in learning about, FinOps and cloud financial management
Curious and self-driven: when faced with a new problem, capable of seeking out an answer without lots of help and support
Experience with AWS, GCP, and Azure cloud services, including compute and big data
Experience with the Datadog analytics platform
Experience with Gimmel Notebooks
Experience with a visualization tool (e.g., Tableau or Power BI)
Experience managing database upgrades / data migrations
Experience optimizing queries