You’ll join a diverse team of software and hardware engineers, supply chain specialists, security experts, product and operations managers, and other vital roles. You’ll collaborate with people across AWS to help us deliver the highest standards for safety and security while providing seemingly infinite capacity at the lowest possible cost for our customers. And you’ll experience an inclusive culture that welcomes bold ideas and empowers you to own them to completion.

You will have a passion for diving deep, a high level of customer focus, and a track record in process improvement. This role requires an individual with excellent analytical abilities, strong knowledge of data engineering solutions, and the ability to work with Finance, quantitative, and business teams. You will lead multiple automation and controllership initiatives across the Finance org. You will primarily work with AWS services such as Redshift, S3, Glue, Lambda, SNS, SQS, CloudWatch, EC2, and Data Pipeline, along with reporting tools such as Tableau and Alteryx/KNIME, to implement solutions. You will be responsible for the full software development life cycle to build scalable applications and deploy them in the AWS Cloud.

Key job responsibilities
- Design and develop the pipelines required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Python and AWS big data technologies.
- Oversee and continually improve production operations, including optimizing data delivery, re-designing infrastructure for greater scalability, code deployments, bug fixes and overall release management and coordination.
- Establish and maintain best practices for the design, development and support of data integration solutions, including documentation.
- Collaborate with Finance, Tax, Supply Chain, Procurement, and Engineering to capture requirements and deliver analytics solutions.
- Read, write, and debug data processing and orchestration code in SQL and Python, following best coding practices (version control, code review, etc.)
- Apply automation so that with each iteration on a problem, your solution gains scale and self-service capability for stakeholders.
- Participate in strategic and tactical planning discussions, interfacing with business customers to gather requirements and deliver complete reporting solutions.
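As a concrete illustration of the first responsibility above, here is a minimal extract-transform-load sketch in Python. It is an assumption-laden toy: the production stack described in the posting would use Redshift, S3, and Glue, but sqlite3 and an in-memory list stand in here so the example is self-contained, and all table and column names are hypothetical.

```python
# Minimal ETL sketch. sqlite3 stands in for a warehouse such as Redshift;
# the input list stands in for a raw source such as files on S3.
# All names (fact_orders, order_id, amount) are illustrative only.
import sqlite3

def run_etl(raw_rows):
    """Extract raw order rows, transform them, and load a warehouse table."""
    # Extract: keep only rows with a usable amount (in practice, an S3 read).
    extracted = [r for r in raw_rows if r.get("amount") is not None]

    # Transform: normalize amounts to integer cents and drop negatives.
    transformed = [
        {"order_id": r["order_id"], "amount_cents": int(round(r["amount"] * 100))}
        for r in extracted
        if r["amount"] >= 0
    ]

    # Load: write into a warehouse-style fact table.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE fact_orders (order_id TEXT PRIMARY KEY, amount_cents INTEGER)"
    )
    conn.executemany(
        "INSERT INTO fact_orders (order_id, amount_cents) "
        "VALUES (:order_id, :amount_cents)",
        transformed,
    )
    conn.commit()
    count, total = conn.execute(
        "SELECT COUNT(*), SUM(amount_cents) FROM fact_orders"
    ).fetchone()
    conn.close()
    return count, total

raw = [
    {"order_id": "A1", "amount": 19.99},
    {"order_id": "A2", "amount": None},   # dropped at extract
    {"order_id": "A3", "amount": -5.00},  # dropped at transform
    {"order_id": "A4", "amount": 0.01},
]
count, total_cents = run_etl(raw)
print(count, total_cents)  # 2 rows loaded, 2000 cents total
```

In a production pipeline each stage would be a separately deployed, monitored step (e.g. a Glue job or Lambda function with CloudWatch alarms) rather than one in-process function, which is what "optimizing data delivery" and "release management" in the responsibilities refer to.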
- 7+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience mentoring team members on best practices
- Knowledge of distributed systems as they pertain to data storage and computing
- Experience building data products incrementally and integrating and managing data sets from multiple sources
- Experience communicating with users, other technical teams, and management to collect requirements, describe data modeling decisions and data engineering strategy
- Experience operating large data warehouses
- Knowledge of professional software engineering & best practices for full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence
- Experience providing technical leadership and mentoring other engineers for best practices on data engineering