Evolution of Amazon Return Experience
A/B Testing - Launched in Australia 🇦🇺

Context
Amazon's returns experience today relies on a standardised, multi-step, click-based flow in which customers select from pre-defined return reasons. While our delivery speed leads SICs, our returns experience lags: Returns Satisfaction (RSAT) stands at 2.7 out of 5, and the ORC abandonment rate (customers dropping off without taking any action) is trending at 32%. These numbers indicate that customers struggle with the current experience, which often requires multiple attempts to complete a return.
Project Details
My role
Product designer
Interaction design
Who I worked with
Product managers
Data scientist
SDE
Scope
Retail experience
Android & iOS
Timeline
MVP - Jan to April 2024
P0 - Jul to Dec 2024
Problem statement
Current CX
Requires clicking through six screens with multiple decision points, imposing high cognitive load through a text-heavy interface.
P0 - A/B testing CX
AI-powered conversational return CX. Instead of customers adapting to Amazon, the system adapts to their needs, using intelligent processing and contextual recommendations to simplify resolution discovery.
This not only makes the experience more intuitive for customers, but also enables Amazon to capture more detailed, accurate return reasons.
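To illustrate how a conversational flow can capture structured return reasons from free text, here is a minimal sketch. The reason codes, keywords, and function name are illustrative assumptions, not Amazon's actual taxonomy or implementation (which would use an ML model rather than keyword matching):

```python
# Hypothetical sketch: mapping a customer's free-text message to a
# structured return-reason code, as a conversational return flow might.
# Codes and keywords are illustrative only.
REASON_KEYWORDS = {
    "DEFECTIVE": ["broken", "damaged", "doesn't work", "faulty"],
    "WRONG_ITEM": ["wrong item", "not what i ordered"],
    "TOO_SMALL": ["too small", "tight", "smaller than expected"],
    "NO_LONGER_NEEDED": ["changed my mind", "no longer need"],
}

def classify_return_reason(message: str) -> str:
    """Return the first reason code whose keywords appear in the
    message; fall back to OTHER so the flow can ask a follow-up."""
    text = message.lower()
    for code, keywords in REASON_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return code
    return "OTHER"
```

The fallback to `OTHER` is what lets the conversation stay adaptive: instead of forcing a pre-defined choice, the system can ask a clarifying question and record a richer reason.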
Outcome
Launched in Australia; reduced returns-related CS contacts in the last 30 days.
P1 - Planned CX
Cultural and Linguistic Authenticity: Direct translation produces awkward, unnatural conversations. E.g., "वह बहुत तेज है।" translates literally as "He is very fast.", but a context-aware translation would be "He is very smart."
We need ~10K benchmark conversations per language that capture linguistic nuances, edge cases, and natural conversation flow.
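One way such benchmark conversations could be structured is sketched below. The schema and field names are hypothetical assumptions for illustration, pairing each utterance with a literal and a context-aware reference translation so nuance gaps can be evaluated:

```python
# Hypothetical sketch of a benchmark-conversation record for evaluating
# context-aware translation; all field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class BenchmarkTurn:
    speaker: str        # "customer" or "assistant"
    source_text: str    # original-language utterance
    literal_en: str     # direct translation (often unnatural)
    contextual_en: str  # context-aware reference translation

@dataclass
class BenchmarkConversation:
    language: str                               # e.g. "hi" for Hindi
    tags: list = field(default_factory=list)    # nuances / edge cases covered
    turns: list = field(default_factory=list)

# Example record using the Hindi case from above.
conv = BenchmarkConversation(language="hi", tags=["idiom"])
conv.turns.append(BenchmarkTurn(
    speaker="customer",
    source_text="वह बहुत तेज है।",
    literal_en="He is very fast.",
    contextual_en="He is very smart.",
))
```

Tagging each conversation with the nuances it covers would let the ~10K-per-language corpus be checked for coverage of edge cases, not just volume.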





