

Overview of the role
The Business Research Analyst will be responsible for the data and machine learning components of continuous improvement projects across the Discoverability space. This will require collaboration with local and global teams. The Research Analyst should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. The Research Analyst will perform big data analysis to identify patterns and train models that generate product-to-product and product-to-brand/model relationships. The Research Analyst is also expected to continuously improve the ML/LLM solutions in terms of precision and recall, efficiency, and scalability. The Research Analyst should be able to write clear and detailed functional specifications based on business requirements.
Key job responsibilities
• Scope, drive, and deliver complex projects across multiple teams.
• Perform root cause analysis: understand the data need, pull the relevant data, analyze it to form a hypothesis, and validate the hypothesis with data.
• Build programs to create a culture of continuous improvement within the business unit, and foster a customer-centric focus on the quality, productivity, and scalability of our services.
• Find scalable solutions to business problems by executing pilots and building deterministic and ML/LLM models.
• Manage meetings and business and technical discussions for their part of the projects.
• Make recommendations and decisions that impact development schedules and the success of a product or project.
• Drive teams and partners to meet program and/or product goals.
• Coordinate design efforts between internal and external teams to develop optimal solutions.
• Perform supporting research, conduct analysis of the larger parts of the projects, and interpret reports to identify opportunities, optimize processes, and implement changes.
• Influence and interact with stakeholders at all levels, whether to gather data and information or to execute and implement according to plan.
• Deal with ambiguity and solve problems.
• Communicate ideas effectively and with influence (both verbally and in writing), within and outside the team.
Key Performance Areas:
• Solve large and complex business problems by aligning multiple teams together.
• Data analytics and Data Sciences
• Machine learning
• Project/Program Management
• Automation initiative conceptualization and implementation
• Big Data analytics
• Product development – Scoping and Testing
• Defect Elimination
• Agile Continuous Improvement
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience writing complex SQL queries
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling (a minimal sketch follows below)
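The last qualification above names a concrete workflow (pull data with SQL, process it in Python for modeling) without pointing to a specific system. As an illustration only, a minimal sketch of that workflow is shown below; the connection string, table, and column names are illustrative assumptions, not details from the role.

# Hypothetical sketch: pull rows from a warehouse table with SQL and prepare
# simple features in Python. Connection details, table, and columns are assumed.
import pandas as pd
from sqlalchemy import create_engine

# Connection details would come from the team's actual Redshift/Oracle setup.
engine = create_engine("postgresql+psycopg2://user:password@warehouse-host:5439/analytics")

query = """
    SELECT product_id, brand, category, title
    FROM   product_catalog
    WHERE  snapshot_date = CURRENT_DATE
"""
df = pd.read_sql(query, engine)

# Basic preparation before modeling: normalize text and one-hot encode categoricals.
df["title"] = df["title"].str.lower().str.strip()
features = pd.get_dummies(df[["brand", "category"]])
print(features.shape)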

At Amazon, we're working to be the most customer-centric company on earth and are constantly looking for ways to improve. Our team drives innovation while supporting continuous operational improvement. To get there, we need exceptionally talented, bright, and driven people. In this role, you will be responsible for developing and scaling up sustainable e-commerce packaging standards and sustainable packaging solutions. To be successful, you will need excellent communication, negotiation, and influencing skills to drive consensus across multiple stakeholders. We interface with Operations, Engineering, Category, Finance, Procurement, Data Science, and Global Packaging teams to identify and execute solutions that improve Customer experience by reducing waste and damages in packaging, while also driving significant savings for Amazon.
Key job responsibilities
• Define and execute the packaging ML/AI roadmap for the ROW.
• Be the technical expert on packing productivity, box suite optimization, and packaging material specifications and standards.
• Collaborate effectively with worldwide teams to identify, share and implement best practices that improve packaging performance.
• Coordinate and communicate packaging improvements and progress updates to key stakeholders.
Basic qualifications
• Experience in ML and AI models
• Experience using data and metrics to determine and drive improvements
• Experience owning program strategy, end to end delivery, and communicating results to senior leadership
• Working experience in project management, owning program strategy end to end.
• Bachelor’s degree.
• Proficiency in Excel (pivot tables, VLOOKUPs, etc.).
• Advanced English communication skills (C1+).
Preferred qualifications
• Master's degree or MBA in business, operations, human resources, adult education, organizational development, instructional design, or a related field
• Knowledge of and experience in manufacturing various packaging materials, including design and production of prototypes and new product development
• Consumer packaged goods or e-commerce experience
- 3+ years of program or project management experience
- 3+ years of experience working cross-functionally with tech and non-tech teams
- 3+ years of experience defining and implementing process improvement initiatives using data and metrics
- Bachelor's degree
- Advanced knowledge of Excel (pivot tables, VLOOKUPs) and SQL
- Experience defining program requirements and using data and metrics to determine improvements
- 3+ years of experience driving end-to-end delivery and communicating results to senior leadership
- 3+ years of experience driving process improvements
- Experience in stakeholder management, dealing with multiple stakeholders at varied levels of the organization
- Experience building processes, project management, and schedules


You will be required to deeply understand technology landscapes, and evaluate the use of new technologies. You will be influential within your team and work with peers and senior leaders to define and revise the standards for operational excellence across systems. You will consistently tackle abstract issues that span multiple functional areas and drive your team to push for improvements that can scale across other teams, services, and platforms.
Key job responsibilities
Identify performance bottlenecks in compute infrastructure and propose solutions to address them.
Provide support for cluster and node management, ensuring smooth operation of GenAI infrastructure.
Participate in design and code reviews and identify bottlenecks.
Troubleshoot and research root causes thoroughly and fix defects.
Continuously improve and automate our cluster/capacity/maintenance upgrades.
Experience setting up and managing CI/CD pipelines using tools such as AWS CodePipeline, GitHub Actions, or similar platforms.
Familiarity with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, or the AWS CDK is a valuable asset (a minimal sketch follows below). Furthermore, an understanding of networking concepts like VPC, subnets, and security groups, as well as configuring Load Balancers and Route 53, is desirable.
Hands-on experience with Kubernetes is required.
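The IaC and networking items above are skills statements rather than a specification. As an illustration only, a minimal AWS CDK (Python) sketch of the resources they mention (a VPC with subnets and a security group) is shown below; the stack and construct names are hypothetical.

# Hypothetical AWS CDK (Python) sketch of the networking pieces named above.
# Stack and construct names are illustrative, not part of the role description.
from aws_cdk import App, Stack
from aws_cdk import aws_ec2 as ec2
from constructs import Construct

class ClusterNetworkStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # VPC with public and private subnets spread across two availability zones.
        vpc = ec2.Vpc(self, "ClusterVpc", max_azs=2)
        # Security group that only allows HTTPS traffic originating inside the VPC.
        sg = ec2.SecurityGroup(self, "ClusterSg", vpc=vpc, allow_all_outbound=True)
        sg.add_ingress_rule(ec2.Peer.ipv4(vpc.vpc_cidr_block), ec2.Port.tcp(443))

app = App()
ClusterNetworkStack(app, "ClusterNetworkStack")
app.synth()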
- 3+ years of administrative experience in networking, storage systems, and operating systems, plus hands-on systems engineering experience
- Experience programming with at least one modern language such as Python, Ruby, Golang, Java, C++, C#, Rust
- Experience with Linux/Unix
- Experience with CI/CD pipelines build processes
- Experience with distributed systems at scale

You will be required to deeply understand technology landscapes, and evaluate the use of new technologies. You will be influential within your team and work with peers and senior leaders to define and revise the standards for operational excellence across systems. You will consistently tackle abstract issues that span multiple functional areas and drive your team to push for improvements that can scale across other teams, services, and platforms.
Key job responsibilities
Provide support for cluster and node management, ensuring smooth operation of GenAI infrastructure.
Continuously improve and automate our cluster/capacity/maintenance upgrades.
Troubleshoot and research root causes thoroughly and fix defects.
Develop automation tools for improving operational excellence.
Experience setting up and managing CI/CD pipelines using tools such as AWS CodePipeline, GitHub Actions, or similar platforms.
Familiarity with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, or the AWS CDK is a valuable asset. Furthermore, an understanding of networking concepts like VPC, subnets, and security groups, Load Balancers, and Route 53 is desirable.
Hands-on experience with Kubernetes is required.
- 1+ years of systems development experience
- Experience programming with at least one modern language such as Python, Ruby, Golang, Java, C++, C#, Rust
- Experience with Linux/Unix
- Experience with CI/CD pipelines build processes

As part of the team, you will invent, design, and develop end-to-end products to make the book creation experience simpler and higher quality.
Key job responsibilities
• Collaborate with experienced cross-disciplinary Amazonians to conceive, design, and bring innovative products and services to customers.
• Design and implement system architecture and underlying components. Establish design principles, select design patterns, and instill best practices for software development across multiple teams.
• Anticipate bottlenecks, make trade-offs, and balance the business needs versus technical constraints.
• Work in an agile, startup-like environment to deliver high-quality software.
- 3+ years of non-internship professional software development experience
- 2+ years of non-internship experience in design or architecture (design patterns, reliability, and scaling) of new and existing systems
- 3+ years of experience in the video games industry (supporting title development, release, or live ops)
- Experience programming with at least one software programming language
- 3+ years of experience with the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

You will play a critical role in driving innovation and advancing the state of the art in evaluating and training AI models. You will work closely with cross-functional teams, including product managers, engineers, and data scientists, to ensure that our AI systems are best in class.
Key job responsibilities
- Design complex data collections with human participants in response to science needs: author instructions, define and implement quality targets and mechanisms, provide day-to-day coordination of data collection efforts (including planning, scheduling, and reporting), and be responsible for the final deliverables
- Design and conduct complex data creation tasks using synthetic and model-based data generation methods, following state-of-the-art approaches
- Analyze and extract insights from large amounts of data
- Build tools or tool prototypes for data analysis or data creation, using Python or another scripting language (a minimal sketch follows after this list)
- Use modeling tools to bootstrap or test new AI functionalities
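The tool-prototype item in the list above is open-ended. As an illustration only, a minimal Python sketch of one such prototype is shown below: a script that checks pairwise agreement between two annotators against a quality target. File names, column names, and the 85% target are assumptions for the example.

# Hypothetical sketch of a small data-quality tool: compare two annotators'
# labels and flag the batch if agreement falls below an assumed target.
import csv
from collections import Counter

def load_labels(path):
    # Read item_id -> label pairs from a two-column CSV (assumed format).
    with open(path, newline="", encoding="utf-8") as f:
        return {row["item_id"]: row["label"] for row in csv.DictReader(f)}

def agreement_rate(labels_a, labels_b):
    # Fraction of shared items on which both annotators assigned the same label.
    shared = labels_a.keys() & labels_b.keys()
    if not shared:
        return 0.0
    return sum(labels_a[i] == labels_b[i] for i in shared) / len(shared)

if __name__ == "__main__":
    a = load_labels("annotator_a.csv")
    b = load_labels("annotator_b.csv")
    rate = agreement_rate(a, b)
    print(f"Pairwise agreement: {rate:.2%}")
    print("Label distribution (annotator A):", Counter(a.values()))
    if rate < 0.85:  # assumed quality target for the example
        print("WARNING: agreement below target; revisit instructions and rubric.")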
- Master's or higher degree in a relevant field (Computational Linguistics or equivalent field with computational analysis)
- 2+ years of experience in computational linguistics, language data processing, or AI data creation
- Experience with language data annotation systems and other forms of data markup
- Proficient with scripting languages, such as Python
- Experience working with speech, text, and multimodal data in multiple languages
- Excellent communication, strong organizational skills, and keen attention to detail
- Comfortable working in a fast paced, highly collaborative, dynamic work environment
- PhD in Computational Linguistics (or equivalent field with computational emphasis)
- Expertise in bootstrapping AI data collections for quickly evolving requirements
- Extensive experience working with speech, text, and multimodal data in multiple languages
- Experience in data creation for complex agentic workflows
- Practical experience with Machine Learning
- Familiarity with technical concepts such as APIs
- Practical knowledge of version control and agile development
- Familiarity with database queries and data analysis processes (SQL, R, Matlab, etc.)
- Willingness to support several projects at one time, and to accept reprioritization as necessary
- Able to think creatively, with strong analytical and problem-solving skills
