About the Role
As a Sr. Data Engineer on the Business Applications team, you will work across multiple areas of Data Engineering & Data Architecture, including the following:
- Data Migration - From Hive/other DBs to Salesforce/other DBs and vice versa
- Data Modeling - Understand existing sources & data models, identify gaps, and build the future-state architecture
- Data Pipelines - Building Data Pipelines for several Data Mart/Data Warehouse and Reporting requirements
- Data Governance - Build the framework for data governance and data quality profiling & reporting
What the Candidate Will Do:
- Demonstrate strong knowledge of, and the ability to operationalize, leading data technologies and best practices.
- Collaborate with internal business units and data teams on business requirements, data access, processing/transformation, and reporting needs, leveraging existing and new tools to provide solutions.
- Build dimensional data models to support business requirements and reporting needs.
- Design, build, and automate the deployment of data pipelines and applications to support reporting and data requirements.
- Research and recommend technologies and processes to support rapid scale and future-state growth initiatives on the data front.
- Prioritize business needs, leadership questions, and ad-hoc requests for on-time delivery.
- Collaborate on architecture and technical design discussions to identify and evaluate high-impact process initiatives.
- Work with the team to implement data governance and access control, and identify and reduce security risks.
- Perform and participate in code reviews, peer inspections, and technical design/specifications.
- Develop performance metrics to establish process success and work cross-functionally to consistently and accurately measure success over time.
- Deliver measurable business process improvements while re-engineering key processes and capabilities, mapping them to the future-state vision.
- Prepare documentation and specifications on detailed design.
- Be able to work in a globally distributed team in an Agile/Scrum approach.
What the Candidate Will Need:
- Bachelor's Degree in computer science or similar technical field of study or equivalent practical experience.
- 8+ years of professional software development experience, including experience in the Data Engineering & Architecture space.
- Interact with product managers and business stakeholders to understand data needs and help build data infrastructure that scales across the company.
- Very strong SQL skills - advanced-level SQL coding (window functions, CTEs, dynamic variables, hierarchical queries, materialized views, etc.).
- Experience with data-driven architecture and systems design; knowledge of Hadoop-related technologies such as HDFS, Apache Spark, Apache Flink, Hive, and Presto.
- Good hands-on experience with object-oriented programming languages like Python.
- Proven experience in large-scale distributed storage and database systems (SQL or NoSQL, e.g. Hive, MySQL, Cassandra) and in data warehousing architecture and data modeling.
- Working experience with cloud technologies such as GCP, AWS, and Azure.
- Knowledge of reporting tools like Tableau and/or other BI tools.
* Accommodations may be available based on religious and/or medical conditions, or as required by applicable law. To request an accommodation, please reach out to .