Return Flow V2

Creating a more fluid and flexible shopper returns flow

Company

Returnly


Project name

Shopper Flow V2


Role

Senior UX Designer


Date

Aug 19

Returnly is a start-up focussing on improving the online returns experience for shoppers in the US. The ‘shopper flow’ is the user journey that a consumer goes through in order to initiate and complete a return online. It’s a fundamental part of the product and value proposition.


Returnly already had an existing flow, but the internal impression was that, whilst it 'worked OK' and was functional to a point, it no longer felt modern. In addition, the market had become more crowded, and parts of the business strategy (e.g. omni-channel) would require fundamental rewriting of large pieces of code in the near future.


The UX team was asked to research and explore a new shopper flow vision that would keep the business competitive and support the product strategy more easily. Given its importance, I used this project to teach the team some design thinking using the 'design sprint' process by Google Ventures and Jake Knapp.

Foundational research

The first thing I did was conduct some end-to-end moderated user testing of the existing shopper flow. Given the focus of the project, it was highly likely we would be proposing some big changes, so I knew it was important to understand where we were and how customers felt about the current experience, as well as to qualify some of the internal perceptions.


This would allow us to create a vision that not only fixed the existing usability issues but also retained the strengths of the existing platform. Given how young design was as a discipline at Returnly, I didn't have the benefit of existing formative research or analytical data to review beforehand, so this testing would serve as the foundation on which we could build the experience and the rationale for any decisions. I invited all product managers and key stakeholders to the live sessions so they could watch with me in real time and build a shared understanding of the problems we faced.

Moderated user testing using PingPong

Initial findings

In general, three areas suffered the most usability problems:

1) The landing page, where users made a series of errors trying to begin the flow (which would therefore be a key headwind to overall conversion)


2) Confidence when selecting items, particularly when the user was in an exchange scenario, as the flow was more complicated


3) Finding their way back to the return confirmation page to check the status of an existing return, as the email communications were not consistent (I faked as much of this as possible to get a realistic end-to-end impression)


I summarised these in a spreadsheet ranked by severity and shared it with product management. Going through the flow end to end also helped me start to formulate ideas for some of the problems I observed, and I took notes of these to incorporate into the design sprint I was already planning.

Spreadsheet of usability issues

Early idea notes

Getting a handle on analytics

As Returnly was a young start-up, the web analytics available for understanding real shopper behaviour on the platform were limited.


I knew this could be a risk to the success of the project, and to the design sprint methodology itself, if we could not empirically prove a positive impact.


I started listing the metrics and KPIs that I felt were missing and, working with a colleague in product management, we set about analysing and prioritising them further.


We knew we would not have everything we wanted ready for day 1 of the design sprint, but we worked hard to prioritise the most meaningful items that would give the greatest insight in the shortest possible timeframe. We eventually came up with a large spreadsheet of ranked and actionable items for the engineering teams to pick up, and completely re-diagrammed our events in Mixpanel to make sure we had not forgotten anything.
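As an illustration of the kind of instrumentation we mapped out, the sketch below shows how a returns funnel might be tracked with the Mixpanel JavaScript SDK. The event names, properties and token are hypothetical examples for this write-up, not Returnly's actual taxonomy.

```typescript
// Illustrative sketch only: the event names and properties below are
// hypothetical examples of funnel instrumentation, not Returnly's real taxonomy.
import mixpanel from 'mixpanel-browser';

mixpanel.init('YOUR_PROJECT_TOKEN'); // placeholder token

// Fired when a shopper lands on the returns start page
mixpanel.track('Return Flow Started', {
  entry_point: 'order_email', // e.g. email link vs. direct visit
});

// Fired when an item is selected, capturing refund vs. exchange intent
mixpanel.track('Return Item Selected', {
  resolution_type: 'refund', // 'refund' | 'exchange'
  items_in_order: 3,
});

// Fired on successful completion, closing the funnel
mixpanel.track('Return Completed', {
  items_returned: 1,
  resolution_type: 'refund',
});
```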

Design sprint prep

Design as a discipline at Returnly was young, so I knew I would have to do a lot of education around what a Design Sprint was, how it worked and, in particular, how to step through the workshops that would be required.


I convinced two of my colleagues from the US to come to Madrid for two weeks so we could focus and get through as much of the design sprint as possible in that time (the 'official' time frame is a lot shorter but this was our first attempt and we were a very small team so I felt building understanding was more important).


I prepared a shared presentation in the weeks before the design sprint was scheduled so we could walk through everything we knew together on day 1 of the sprint, and I assigned a focus area to each member of the team (competitive analysis, analytics, business goals, previous product ideas, etc.) to collect relevant inputs.


Lightning talks and 'how might we'

On day 1 of the design sprint the team spent the day sharing and discussing insights from their assigned areas to build collective understanding across the working group. As each person spoke, everyone captured key points that resonated with them on post-it notes in the form of 'how might we…' statements. We then collected all of these at the end of the session and put them in a shared place ready for further discussion.

Affinity sorting and voting

On day 2 of the design sprint the team first recapped the highlights of day 1 and then worked together to start grouping the 'how might we' statements into similar themes.


These were then voted on individually so that we could start to focus on the points the group felt were most thought-provoking and deserved to be explored further.


From this exercise a set of 3-4 specific focus areas emerged with the most votes, which we then expanded on by defining the customer demographics, pain points and success criteria in detail.


Through this exercise we also identified a lot of missing analytics that needed to be set up, and some key business / product questions that we had not asked ourselves before. In the breaks between sessions this prompted us to try and answer some of these in order to help influence the direction of the design sprint (more info below).

Idea lists and Crazy 8's

On day 3 of the design sprint we started to switch from analysing the problem to creating (early) solutions.


Taking the previously defined focus-area pain points, the team created individual 'idea lists' by trying to think of as many ways to solve them as possible. Then we ran several rounds of an exercise called 'Crazy 8's' to help illustrate the best ideas back to the group.


At this stage we were after volume, not detail, but we already started to see patterns emerging across the sketches and so began to isolate and rank them.

Additional insight

We knew before the first workshop sessions that we had missing analytical data and that there were key business questions that we needed to answer in order to feel more confident about the user experience direction that was starting to emerge in the sketches.


We'd already identified usability problems in the item selection page of our flow from the earlier usability testing, so we decided to prioritise the questions we had in this area, such as 'What is the average number of items a customer returns?' and 'What is the breakdown between exchanges and returns?'


Knowing the answers to these questions would help identify which of the new ideas had merit. From some analysis between the workshop sessions we discovered that most of our customers returned one item and had a low number of items in the original order. In addition, most returns were for a refund and not an exchange, which was a simpler flow. This insight directly affected the design sprint, and we doubled down on those ideas and journeys that supported this use case first.

Additional analytics insight
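For illustration, the sketch below shows the kind of breakdown we were after: given a set of return records, how many were single-item and how many were refunds vs. exchanges. The record shape and sample figures are assumptions made for this write-up rather than Returnly's real data model.

```typescript
// Hypothetical record shape, assumed for illustration only.
interface ReturnRecord {
  itemCount: number;
  resolution: 'refund' | 'exchange';
}

// Compute the share of single-item returns and refund vs. exchange split.
function summariseReturns(records: ReturnRecord[]) {
  const total = records.length;
  const singleItem = records.filter(r => r.itemCount === 1).length;
  const refunds = records.filter(r => r.resolution === 'refund').length;

  return {
    singleItemShare: total ? singleItem / total : 0,
    refundShare: total ? refunds / total : 0,
    exchangeShare: total ? (total - refunds) / total : 0,
  };
}

// Example usage with made-up data.
console.log(summariseReturns([
  { itemCount: 1, resolution: 'refund' },
  { itemCount: 1, resolution: 'refund' },
  { itemCount: 2, resolution: 'exchange' },
]));
```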

Solution sketches

On day 4 of the design sprint we started to focus further and iterate on some of the earlier low fidelity sketches.


I divided the team into 3 groups of 2 people each, pairing people who had similar thoughts / ideas or who chose to focus on the same specific part of the product. This gave us an opportunity to combine and consolidate the best ideas and take the solutions to the next stage.


Each group spent some time discussing and elaborating on their original idea and then presented back to the overall group for further feedback. Some of the sketches prompted live whiteboarding within the group, building on top of the ideas already shared. A wireframe of an optimised 1 page / 1 item flow started to take shape as we tried to take on board the additional analytical data we had discovered during the sprint, as well as to address the usability problems of our existing flow.

User flow and wireframe ideation

On day 5 of the design sprint we went back to the early wireframe that had been the outcome of the sketching phases and iterated further on it to make things clearer. We also detailed the overall user flow to make sure we had not forgotten anything and that the key parts of the new design would fit the end-to-end experience, and we started to explore alternative UI approaches to the item selection area.

Are we on track?!

At this point the design sprint had been running for a week and we were starting to get into the detail of wireframes and user flows.


For many of the team it was the first time they had taken part in design-thinking workshops, so I wanted to make sure we were still on track to solve the fundamental user pain points we had identified earlier and were not going 'off-piste' with our ideas.


To do that I went back and compared the list of the pain points with the ideas we had given a name to at the end of the solution sketch phase. I looked for correlation between the two and ran through this with the group to make sure everyone felt the inputs matched the output we were heading towards.


Both I and the group felt that, whilst we hadn't resolved all the issues, more than 50% were actively being addressed by the new design. The exercise also helped identify some of the smaller areas we had forgotten, and we resolved to re-incorporate these into the design later (the ones marked with stars).

Comparing early stage design sprint inputs with late stage outputs

Prototyping round 1

From days 5-7 I worked closely with a UI designer to take the wireframe(s) we had created as a group and start to flesh things out more fully.


We used Figma so we could collaborate in real time on the same file and combine work more easily. I took a UX approach, focussing more on the overall flow and direction, whereas my colleague focussed on some of the UI detail and, in particular, on remaking some of our earlier product ideas that were compatible with the new experience we wanted to create (such as how to colour-code the exchange vs. return choice).


At this stage we started with the most extreme version of the optimised 1 page / 1 item experience we wanted to create, which was to display all user choices on a single page, including all choices related to the exchange flow. We knew this could make the page more complex, but we wanted to explore how far we could push the design.


Prototyping round 2

It was really challenging to get everything to exist on a single page, particularly when you took into consideration the choices associated with an exchange flow, which was more complex.


In addition, whilst we were optimising for a 1 item return, the design still needed to scale and support the use case where a customer was returning more items; if more than 1 item was an exchange, the page would get very complex.


Collectively, the group also gave feedback that the strong focus on item choices was starting to harm the clarity of the return overview (which items am I returning and what status are they in?), which was already an identified usability weakness of our existing flow that needed rectifying.


We came to the conclusion that we needed to move the secondary steps somewhere else and add some UX hierarchy to avoid them competing for space on the main page. We started to explore modals as a way to achieve this and used progressive disclosure to hand-hold the user along the journey.
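As a rough sketch of the pattern (not the actual Returnly implementation), the snippet below keeps the refund vs. exchange choice on the main page and discloses the secondary exchange steps in a modal only when they become relevant. Component and prop names are invented for illustration.

```tsx
// Minimal sketch, assuming React: secondary exchange choices are hidden
// behind a modal so they don't compete for space on the main return page.
import React, { useState } from 'react';

type Resolution = 'refund' | 'exchange';

export function ReturnItemRow({ itemName }: { itemName: string }) {
  const [resolution, setResolution] = useState<Resolution>('refund');
  const [showExchangeModal, setShowExchangeModal] = useState(false);

  const chooseExchange = () => {
    setResolution('exchange');
    setShowExchangeModal(true); // disclose the extra steps only when needed
  };

  return (
    <div>
      <p>{itemName}: {resolution === 'refund' ? 'Refund' : 'Exchange'}</p>
      <button onClick={() => setResolution('refund')}>Refund</button>
      <button onClick={chooseExchange}>Exchange</button>

      {showExchangeModal && (
        <div role="dialog" aria-label="Exchange options">
          {/* Secondary choices (size, colour, etc.) would live here instead
              of crowding the main overview page */}
          <button onClick={() => setShowExchangeModal(false)}>Done</button>
        </div>
      )}
    </div>
  );
}
```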


We also hadn't properly addressed an order with more than 1 item, so I took this as an opportunity to test how it would feel if we started with an item selection page instead of landing on the overview page directly and auto-selecting the first item.


Prototyping round 3

Through the first two rounds of Figma prototyping I was constantly sharing progress and getting feedback from the UI designer I was working closely with, as well as from the wider working group.


We had already been collaborating on the overall UX direction, but by this time they had evolved the UI in certain parts, and I felt it was time to bring the UX and UI together so we could consolidate our work prior to user testing and early customer feedback.


We also added new landing page designs inspired by some of the other early sketches the team had created, as well as a new confirmation page that incorporated some aspects of the upcoming business strategy. This gave us a full end-to-end flow with which we could test and gather end-user impressions.


PingPong moderated testing

I created a test script (below) and, using the prototype above, we started to conduct usability testing of our new returns flow.


I was really pleased that it got some great reactions, and there were a few 'wow' delighter moments when users realised how quick the flow felt and that they could action everything from one page.


There were still some usability problems, however, which was to be expected from a first test. In particular, the UI design had incorporated some changes to the mechanism for selecting another / different product to return, and this was missed by participants.


In addition, we had tried a gallery at the top whereby the user could scroll to see other items in the original order that were eligible for return, but this was misunderstood as a way to see alternative images for that product, not alternative items. We only completed 2 participant tests before I left the company, but my intention was to complete 3 participants (enough to cover the biggest issues while allowing faster iteration) and then iterate quickly on this design.

User testing script