Material (left to right, top to bottom): Interview guide, consent form, sketch files provided by clients, note-taking doc, cards
Generative interviews & Usability tests
For both clients, I was one of two UXers involved in the entire 7-step process.
I ran the initial heuristic analysis and prepared the report, performed as both test facilitator and note-taker, analyzed the data and insights obtained, and formulated the final report.
A final report with insights pointing at potential improvements.
Total (i.e. including pre- and postproduction): 2 - 8 weeks
Test sessions: Depending on the size of the project, we spent two days to one week performing tests, with a maximum of 4 tests per day, and each test lasting around 60-90'.
Starting with a little bit of theory
"Usability testing refers to evaluating a product or service by testing it with representative users. Typically, during a test, participants will try to complete typical tasks while observers watch, listen, and take notes. The goal is to identify any usability problems, collect qualitative and quantitative data, and determine the participant's satisfaction with the product." *
However, a usability test comprises much more than just having someone perform a task on a digital device. There's pre- and post-production, too.
In this section, I'll explain this entire 7-step process, as applied during my projects at Interactius with Oysho and FC Barcelona.
Step 0: What do we want to find out?
Before we can actually start with step 1 of the usability testing process, we have to identify a need: one we can tailor our tests to.
So, the first and biggest question is: What do we want to find out?
🏀 In practice
Things we were asked by our clients to test included, among others:
Do users understand that the website has two menus?
Sticky navigation bar:
Do users notice that once they start scrolling the navigation bar appears at the bottom and stays there?
Findability of their website:
Do users find the website, and how?
Navigation within the website, and findability of specific products and services:
Do users find what they are looking for and how?
Product searching and buying process:
How do users search for a specific product and how do they go about the purchase?
Are the icons used clear?
What about words in English? (reminder: we test in Spain)
How do users handle the checkout?
How easy is it to apply discount codes?
How confident are they?
1️⃣ Plan the test
In this step, we have to allocate time for steps 2️⃣ to 7️⃣.
How much time is each action going to take up?
Depending on the size and complexity of a usability testing project, its overall duration may span as little as 3 weeks or several months.
We furthermore need to determine the dates on which we will conduct the tests. If we're running usability tests on behalf of a client, the days have to be compatible with both our and our client's schedule so they can attend if they want to (and it's feasible).
Other questions that need to be addressed in this phase refer to the material:
Do we test a real product or a prototype? Who prepares the prototype? Are there any international participants? What languages does the material need to be in? What categories do we test in the card sorting? ...?
🏀 In practice
We organized two rounds of testing. The first one we performed in a couple of days with eight participants, the second one in one day with four.
The overall time invested in the projects, including the previous heuristic analysis, recruiting participants, evaluating insights, and writing the report, was three weeks.
The tests were done on a combination of prototypes and the client's real website. The prototypes and the flows we wanted to test were prepared by me with material provided by the client.
We tested with 15 participants, and it took us a bit over a week.
The overall time span, from the first contact with the client until the final presentation of results, was approximately 8 weeks.
The tests were performed entirely on their current website.
2️⃣ Recruit participants
We have to define the appropriate profile of our participants: What age group are we looking for? Gender, hobbies, professions, culture, shopping habits, ...? What devices do they use, how often and in what situations? ...?
The answers to these questions depend on the target audience and the aspects of the product we're testing. We may get the answers with the help of the client and their team, but we may also have to do some research ourselves.
If we don't have a database with users, we'll have to engage a specialized company to do the recruitment.
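Conceptually, this screening step works like a filter over a candidate pool. Here is a minimal sketch of that idea; the profile fields, thresholds, and candidates are all invented for illustration, not the actual criteria we used:

```python
# Hypothetical participant screener: filter a candidate pool against a
# target profile. All names, fields, and thresholds are made up.

candidates = [
    {"name": "A", "age": 29, "shops_online": True,  "devices": {"mobile", "desktop"}},
    {"name": "B", "age": 52, "shops_online": True,  "devices": {"desktop"}},
    {"name": "C", "age": 34, "shops_online": False, "devices": {"mobile"}},
]

def matches_profile(c, min_age=25, max_age=45, required_device="mobile"):
    """Return True if the candidate fits the (hypothetical) target profile."""
    return (
        min_age <= c["age"] <= max_age
        and c["shops_online"]
        and required_device in c["devices"]
    )

shortlist = [c["name"] for c in candidates if matches_profile(c)]
print(shortlist)  # ['A']
```

In practice a recruiting agency does this matching for you, but writing the criteria down this explicitly is a useful exercise before emailing them the brief.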
🏀 In practice
In our case, we recruited participants for both clients with the help of specialized companies.
We emailed them the different profile requirements as well as our test schedule, and they made sure to find matching participants in time.
We were looking for a female and sporty profile.
The women had to be within a specific age range and be online shoppers with varying levels of technical maturity.
After the first round of testing, we felt that we needed to narrow down our target a bit more and specified that they also had to be customers of certain other brands.
For this test, we needed international participants.
Together with the client, we decided on the appropriate nationality, gender, and age mix. Other aspects we covered during our search referred to their use of mobile or desktop devices, and if they were soccer fans or generally interested in culture and sports.
‼️ We did have some trouble finding suitable Asian participants, which slowed down the process and could have forced us to postpone the test sessions.
‼️ And one test we even had to declare invalid because the participant was not suitable at all and didn't share any useful information.
3️⃣ Prepare materials
Materials that need preparing relate to ✏️content and 📎formal aspects and include, but are not limited to:
- formulating contextualized tasks based on the declared testing goals
- drafting the interview guide
- preparing prototypes or reviewing the live website
- writing a pre-report based on the heuristic analysis
- preparing the note-taking document
- deciding on the categories for the card sorting and preparing the cards
- preparing consent forms (i.e. translating them into all required languages and printing enough copies)
- printing a participant list, which may include details relevant for the test
- preparing vouchers and the signature list
🏀 In practice
Of all the things listed above, I find contextualizing the tasks to be the most challenging: How do you frame a research goal in a way that a user relates to, so that it yields great insights?
One of the things Oysho wanted to find out was whether their users knew that, for sports clothing, they had to go through the Oysho Sport menu.
The instruction we gave them was:
A friend of yours has bought a dark blue sports bra that you liked, and you want to know more about it.
Among other things, this client wanted to find out if users knew how to buy tickets to visit the FC Barcelona museum directly from their website.
What we asked of them was:
Imagine that you are visiting the city and you want to visit the club's facilities. Find the type of tickets offered and choose the one that interests you the most. Once you have chosen it, tell us your choice.
‼️ We soon discovered that the word "facilities" caused confusion and rephrased it to say "museum".
4️⃣ Set up environment
Here again, numerous things need taking care of.
The most time-consuming aspects certainly relate to the devices and applications used for the tests. They have to work. And we have to know how they work!
This includes how to create a new usability project, how to operate and switch between desktop and mobile recording, how to enable eye-tracking, how to adjust the cameras recording the room, ensuring the mics are working, ...
If some time has passed since we last used them, we have to allow some time for refreshing our know-how and making a test run.
We also have to load the note-taking document, set the room temperature for a pleasant ambiance, and prepare the consent form, a pen, and water for both the participant and the facilitator.
🏀 In practice
For these tests, I had to learn to work with Morae: how to create new projects and what to do when switching between mobile and desktop tests.
These tests were performed with eye-tracking, so in addition to Morae, I had to handle another program called Tobii.
Tests with eye-tracking are costly, and therefore rare. The consequence is that we forget how the technology needed to run them works. This is what happened in our case: it took us several hours over a couple of days to figure it all out again.
While we were at it, I took notes and prepared a step-by-step document afterward for future reference.
5️⃣ Conduct the test
The test phase starts by welcoming the participant, introducing oneself, and explaining what's about to happen in the next 60 to 90 minutes. We need to inform them that the session (image and sound) will be recorded and that the material will be used for the purposes of this project only. We also have to make sure they sign the consent form.
It's important to make the participant feel comfortable and safe so they'll behave and act as naturally as possible. This will help us obtain more and better insights.
Once the formal aspects are out of the way, we offer water and start with some small talk aimed at getting to know the participant and their habits a little bit: what do they do for a living, how long have they been at it, what do they like doing in their free time...and try to steer the conversation towards the topic of the test.
This introductory part shouldn't take up more than 15'.
This is when the participant performs the task(s) we've designed for them. We have to make sure they understand what they're expected to do so they'll be able to perform the tasks on their own. This is especially important if we're performing tests with eye-tracking, which should be run entirely without think-aloud.
Tests I've run included several tasks and were performed both on prototypes and working websites, sometimes even in the same session, as well as a card sorting exercise.
At the end of the test, the participant is asked to describe the digital process in their own words, weigh the best against the worst of it, and tell us what their personal wishlist for the product would be.
We wrap up the session by thanking the participant for their time and sharing valuable information with us. If applicable, this is also the time to hand over the voucher compensating them for their participation in our test.
🏀 In practice
Oysho & FC Barcelona
As with anything when you're a newbie, it takes some rounds to work out the kinks of usability testing.
‼️I found myself either spending too much or not enough time with the introduction and getting to know the user.
‼️When participants performed the tasks, I sometimes thought I was cutting them off too soon. But when I let them go on, I wondered whether we were straying beyond the research goal.
‼️And at times, I was unsure what to do when a participant had doubts about how to proceed: How do I best intervene without guiding them?
All in all, however, I'm happy to say that I had a great learning curve and that I was able to produce some good insights!
6️⃣ Analyze data
One might think that once we're done with the tests, we may sit back and relax. Quite the opposite!
Now the real work starts: the work that will help us improve the product we just tested, thus making it more user-friendly.
We have to gather and analyze the information contained in the test protocols and recordings, and group the insights. We have to identify behavioral patterns and translate our findings into actionable improvements.
Our findings may relate to usability, interaction, or writing issues, among others. We may classify them by severity.
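The grouping itself can be thought of as a simple bucket-by-severity operation. A minimal sketch of the idea; the findings listed and the 1-to-4 scale are invented for illustration, not taken from the actual reports:

```python
from collections import defaultdict

# Invented findings; severity on a hypothetical 1 (cosmetic) to 4 (blocker) scale.
findings = [
    {"issue": "Sport tab not seen as a separate menu", "area": "navigation",  "severity": 4},
    {"issue": "English labels confuse users",          "area": "writing",     "severity": 2},
    {"issue": "Discount code field hard to find",      "area": "interaction", "severity": 3},
    {"issue": "Icon meaning unclear",                  "area": "interaction", "severity": 2},
]

# Bucket issues by severity so the report can lead with the worst ones.
by_severity = defaultdict(list)
for f in findings:
    by_severity[f["severity"]].append(f["issue"])

# Print the most severe issues first.
for severity in sorted(by_severity, reverse=True):
    print(severity, by_severity[severity])
```

The same pattern works for grouping by area (navigation, interaction, writing), which is how patterns across participants tend to surface.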
🏀 In practice
This is where we go back to Step 3️⃣ and try to find answers to the research questions.
Regarding the Oysho Sport menu, the tests showed clearly that users did not identify the corresponding tab as a separate menu where they could search for products.
As a consequence, the company applied the change we suggested in our final report (see image below).
The question we were trying to answer was whether users knew how to buy tickets to visit the FC Barcelona museum directly from their website.
The short answer is "not easily".
We did make several recommendations in our final report, which to date haven't been implemented.
Curious whether you can find it? Test yourself here.
7️⃣ Report results
Last but not least, our findings and proposed improvements need to be accurately communicated to our client and team.
Regardless of the format (i.e. sending a document via email, presenting in person in front of stakeholders, or remotely online), we have to make sure our report provides context on the entire process and specifically on the usability tests.
It should be concise and clear, free of spelling and other errors, and visually appealing.
Below you'll find a few slides of one of our reports.
🕹 Control the technology
This might seem obvious. And it is.
Still, I found myself in a situation where I was a newcomer, and my colleagues hadn't used some of the tools in quite some time. This caused quite some anxiety both before and during the tests.
We did take the time to go through all the steps, and I even prepared a step-by-step guide and rehearsed the process a few times. However, we couldn't avoid technical errors and human mistakes during the tests.
We managed those situations well, and the participants' performance wasn't disrupted. Nevertheless, next time I'll make time between tests to thoroughly check cables, programs, cameras... to avoid unnecessarily stressful moments for everyone.
⏰ Space the sessions
Our schedule always allowed for four tests per day.
And this works well for short usability tests with few participants and reduced content.
For bigger projects, however, we realized that the quality of both the interviews and the protocols decreases after the third test.
Even though spreading the tests over more days may increase the overall cost of the usability tests (more days = more participant and session costs), I believe it pays off.
On the one hand, we may employ the time between tests to review the interview protocols. If we identify patterns, we can follow up on them in the next interviews.
On the other hand, the fresher the interview facilitator and note-taker are, the better interviews and protocols they produce.
And these provide, after all, the material for potential usability improvements.
So rather than viewing one or two days more of testing as a waste of time and money, it should be considered an investment in better results and ROI.
🎨 Find your own style
The usability tests I did for Oysho and FC Barcelona were my first ones.
Technical challenges apart, there was the nervousness of how to... everything.
How do I greet the participant, how do I kick off the test, should I be more formal, more casual, adapt my attitude to the participant, where should I sit, am I sitting too close, not close enough, ...
I was lucky enough to work with and learn from colleagues with very different styles. This inspired me to play with a variety of approaches myself. It was like putting on dresses and testing which one suited me best!
After three rounds of testing, I'm happy to say that I'm a bit closer to my style: approachable and natural.
💃 Enjoy it!
Regardless of your style, I've come to believe that enjoying the usability test from beginning to end is key for generating great insights.
Being genuinely interested in the person you have in front of you, and curious about what makes them tick, helps them feel comfortable and willing to share more.