Articles

Remote Usability Testing - A Christmas ghost story

David Humphreys
December 01 2015

Ho! Ho! Ho! Merry Christmas! Uncle Dave here with a Christmas story about UX lessons relearned, and the ghost of an old client returning to bring you a bit of UX joy.

The ghost of clients past

We were recently contacted by a client we had not heard from for years, one who had spent that time building their in-house UX capability. Then one bright sunny summer's morning in mid-December we received a call that went something like this:

Client: Hi, we have a couple of different mobile designs. We are divided. We need an expert review to tell us which to use. We need an answer by COB tomorrow... and we don't want to spend a lot of money.

Peak: Sure. We can do that... we won't pick one design, but we will catalogue the issues and help you make up your mind.

To us it seemed straightforward: look at the two designs, apply our experience and knowledge, and identify the issues. Until a voice was raised in dissent, perhaps a Christmas angel (but more likely Tania), who said, "We need to test this. What if we are wrong?"

One of our core philosophies is 'Some testing is always better than none', so we told the client we would test their designs. But how could we do it within the time frame and budget (i.e. in a 24-hour window)?

A saviour is (re)introduced

There are a few tools we regularly use (Optimal Workshop's excellent suite, for example) that offer inbuilt online recruitment of participants, but their lead times all seemed too long for our needs.

However, we had noticed that fivesecondtest.com had broadened into a suite of products under the UsabilityHub banner, and we knew that fivesecondtest.com had always been very good at generating responses. The suite now included a click testing tool, and we hoped the rapid responses from its global user base would give us a quick enough set of results to meet our client's tight time frame.

We anticipated that most of the respondents would come from the US overnight, while we slept. Not being terribly concerned about any potential cultural differences, we set up the test with 10 tasks per design, launched it at 6pm and went home.

Sort of...

Because before we finished for the day we already had half of our planned 20 or so responses for each task. India is only a few hours behind Australia, and results were flooding in from the subcontinent. Again, this was not a significant concern, as English is a second language for many Indians, but it was an unexpected outcome nonetheless.

Expectations overturned

So what happened? If not a Christmas miracle, then certainly a Christmas reminder. Our first impressions had favoured one design over the other. As is often the case, one design was more visually appealing and 'modern looking', but it didn't test as well when users attempted common tasks. The other design was also marginally more efficient when we looked at time on task. Our first impressions and assumptions about the most efficient and usable design were disproved. The testing highlighted a few usability issues with the "pretty" design and reminded us of the value of testing with users, even when you have extremely tight time frames.

Lessons learned and relearned

What did we learn? I think we reschooled ourselves in some fundamentals and learnt some practical lessons about rapid usability testing.

  1. Some testing is ALWAYS better than none - Ultimately there were strong elements in both designs that should be incorporated into a hybrid design, but without the testing we would have overlooked some of the gems. Our initial reaction was that one design was better than the other; the results from the testing told a different story.
  2. Efficiency is a better measure than preference - Don't ask what users think, observe what they do. We initially considered using UsabilityHub's Preference Test, which asks users which design they prefer. However, if users haven't tried to complete tasks on the site, they tend to base their choice purely on visual aesthetics. By using the click test we were able to measure efficiency (task completion rate and average time on task), so we could give more informed recommendations to our client (see the sketch after this list).
  3. Plan your launch times - If it is important that you target particular users from a geographic location, language group or region, then you need to plan your launch times. We don't think the 44% of responses from India influenced the results significantly, but it might in certain circumstances. The time-zone sketch after the map below shows a quick way to check.
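
To make the efficiency measures in point 2 concrete, here is a minimal sketch of the arithmetic behind task completion rate and average time on task. The data structure and numbers are hypothetical, purely to illustrate the calculation; a tool like UsabilityHub reports these figures for you.

    # Hypothetical click-test results: each tuple records whether the
    # participant clicked the correct target, and seconds to first click.
    from statistics import mean

    results = {
        "design_a": [(True, 3.2), (True, 4.1), (False, 9.8), (True, 2.7)],
        "design_b": [(True, 2.1), (False, 7.5), (False, 8.9), (True, 3.0)],
    }

    for design, clicks in results.items():
        # Completion rate: share of participants who hit the right target.
        completion_rate = sum(ok for ok, _ in clicks) / len(clicks)
        # Average time on task, counting successful attempts only.
        avg_time = mean(t for ok, t in clicks if ok)
        print(f"{design}: {completion_rate:.0%} completed, {avg_time:.1f}s avg")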

[Image: time zones world map]
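
And on point 3: a quick way to sanity-check a launch time before pressing go is to print the local time in the regions your respondent panel draws from. A minimal sketch, assuming a 6pm Brisbane launch and an illustrative set of regions:

    # Where will respondents be awake when the test goes live?
    # The launch timezone and region list are assumptions for illustration.
    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+

    launch = datetime(2015, 12, 14, 18, 0, tzinfo=ZoneInfo("Australia/Brisbane"))

    for zone in ("Asia/Kolkata", "Europe/London",
                 "America/New_York", "America/Los_Angeles"):
        local = launch.astimezone(ZoneInfo(zone))
        print(f"{zone}: {local:%a %H:%M}")

Run against our 6pm launch, this shows India in mid-afternoon and the US in the small hours of the morning, which is exactly the pattern of responses we saw.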

We are going on holidays!

We will be shutting down for the holidays at 3pm on 18 December 2015 and will be back in the office on 4 January 2016. We wish you a very Merry Christmas and a Happy New Year for 2016.

What are we reading and doing for Christmas?

Apart from Star Wars? Well, we might have a couple of things to keep our minds, and hopefully yours, active over the break:

99% Invisible - This wonderful podcast series explores all areas of design and the ways humans have designed and experienced its products. Start with the recent Episode 187: Butterfly Effects, which explores how a design issue got George W Bush elected, and Episode 161: Show of Force, on how design helped win World War II.

Gamestorming - David attended a workshop with Dave Gray, one of the book's authors, at the recent UX Australia conference, and Gray's ideas about using games, game-like activities and storytelling to elicit deep qualitative information from research participants were wonderful. When you have finished the book, continue the journey with the extra resources Dave and Sunni have collected on their blog.

Categories: Usability testing