Export Control Requirement: Due to applicable export control laws and regulations, candidates must be a U.S. citizen or national, U.S. permanent resident (i.e., current Green Card holder), or lawfully admitted into the U.S. as a refugee or granted asylum.

Key job responsibilities
* Design and implement data pipelines for processing geospatial data from various sources
* Develop and maintain APIs for service consumption
* Create and optimize spatial databases and queries for performance at scale
* Build tools for visualizing and analyzing geospatial data
* Collaborate with cross-functional teams to deliver solutions
* Implement data quality controls and validation processes
* Contribute to geospatial data management best practices
* Design systems to track and report on service metrics
* Implement solutions for data integration from multiple sources
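To give a concrete flavor of the pipeline and data-quality responsibilities listed above, here is a minimal sketch of one way such a stage might look: it drops records with out-of-range coordinates and partitions the rest into coarse latitude/longitude grid cells for parallel processing. The record shape, the one-degree grid scheme, and every name in it are assumptions made purely for illustration, not part of the role or any actual codebase.

```python
from collections import defaultdict
from typing import Iterable

# Hypothetical record shape: (source_id, latitude, longitude) tuples.
RawRecord = tuple[str, float, float]


def is_valid(record: RawRecord) -> bool:
    """Basic data-quality check: coordinates must fall in valid ranges."""
    _, lat, lon = record
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0


def grid_cell(lat: float, lon: float, cell_deg: float = 1.0) -> tuple[int, int]:
    """Map a point to a coarse lat/lon grid cell used as a partition key."""
    return int(lat // cell_deg), int(lon // cell_deg)


def partition(records: Iterable[RawRecord]) -> dict[tuple[int, int], list[RawRecord]]:
    """Drop invalid records and group the rest by grid cell for downstream workers."""
    buckets: dict[tuple[int, int], list[RawRecord]] = defaultdict(list)
    for rec in records:
        if is_valid(rec):
            buckets[grid_cell(rec[1], rec[2])].append(rec)
    return dict(buckets)


if __name__ == "__main__":
    sample = [("a", 47.6, -122.3), ("b", 91.0, 10.0), ("c", 47.9, -122.1)]
    for cell, recs in partition(sample).items():
        print(cell, [r[0] for r in recs])
```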
A day in the life
You'll develop and maintain scalable data pipelines processing terabytes of geospatial data, create APIs powering mission-critical services, and build tools that directly impact our global network deployment decisions. Your solutions will help determine where and how we deploy ground infrastructure to maximize coverage for underserved communities worldwide.
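The posting does not name an API framework, so the following is only a hypothetical illustration of the "APIs powering mission-critical services" idea: a small Flask endpoint that filters an assumed in-memory list of ground-site candidates by a bounding box. The route, query parameters, and data shape are invented for this sketch; a production service would query a spatial database rather than a Python list.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store of ground-site candidates; a real service
# would query a spatial database instead.
SITES = [
    {"id": "site-1", "lat": 47.61, "lon": -122.33},
    {"id": "site-2", "lat": 51.50, "lon": -0.12},
]


@app.get("/sites")
def list_sites():
    """Return sites inside an optional bounding box given by query parameters."""
    def arg(name: str, default: float) -> float:
        return float(request.args.get(name, default))

    min_lat, max_lat = arg("min_lat", -90), arg("max_lat", 90)
    min_lon, max_lon = arg("min_lon", -180), arg("max_lon", 180)
    hits = [
        s for s in SITES
        if min_lat <= s["lat"] <= max_lat and min_lon <= s["lon"] <= max_lon
    ]
    return jsonify({"sites": hits})


if __name__ == "__main__":
    app.run(port=8080)
```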
Basic qualifications
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience as a data engineer or related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of manipulating, processing, and extracting value from large datasets
- Knowledge of professional software engineering & best practices for full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence
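As a minimal sketch of the ETL and data-quality experience described above, the example below extracts hypothetical raw rows, applies basic validation and type normalization, and loads the survivors into SQLite (standing in for a real warehouse). The table layout, column names, and sample data are all assumptions for illustration.

```python
import sqlite3

# Hypothetical raw rows as they might arrive from an upstream export;
# a real pipeline would extract them from files, APIs, or a queue.
RAW_ROWS = [
    {"site": "SEA-1", "lat": "47.61", "lon": "-122.33"},
    {"site": "LON-1", "lat": "51.50", "lon": "bad-value"},
]


def transform(row: dict) -> tuple | None:
    """Normalize types; return None for rows that fail basic quality checks."""
    try:
        lat, lon = float(row["lat"]), float(row["lon"])
    except (KeyError, ValueError):
        return None
    if not (-90 <= lat <= 90 and -180 <= lon <= 180):
        return None
    return (row["site"], lat, lon)


def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Create the target table if needed and bulk-insert cleaned rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sites (site TEXT PRIMARY KEY, lat REAL, lon REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sites VALUES (?, ?, ?)", rows)
    conn.commit()


if __name__ == "__main__":
    cleaned = [t for t in (transform(r) for r in RAW_ROWS) if t is not None]
    with sqlite3.connect(":memory:") as conn:
        load(cleaned, conn)
        print(conn.execute("SELECT COUNT(*) FROM sites").fetchone()[0], "rows loaded")
```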