Conducting Test Sessions
You can either launch a usability-test session immediately or schedule it to begin at a particular time. It's clear that, at least in part, the design thinking behind WhatUsersDo is to help embed usability testing into a company's culture, which is a laudable goal. In my experience, the time it takes to run usability tests varies, but ten test sessions typically complete within 24 to 48 hours. For my study, each test-session video was around 20 minutes long.
Reviewing Session Videos
If you’ve never reviewed the videos from remote usability-test sessions, here’s a taste of what to expect. Some of the reviewers were absolute stars, really going the extra mile to give detailed feedback on the user journey. A few reviewers, thankfully a small minority, were less helpful and failed to follow the instructions.
However, on the positive side, it’s possible to give feedback on reviewers and, if they fail to follow your instructions, to redo the test with a different reviewer. It’s also possible to reject reviews for other reasons, such as poor audio quality, but fortunately I’ve not had to do that. It’s worth noting that, given the detailed instructions I occasionally needed to provide, my rejection rate was possibly higher than average. When I’ve needed to redo sessions, replacement reviewers have become available very quickly, and I’ve never had a problem with a replacement reviewer.
The biggest problem I’ve had when reviewing test-session videos has been my frustration with users. As a UX designer, it can feel heretical to say this, but sometimes users do dumb things. When you’ve spent several hours reviewing user interactions, listening to comments, and—most frustrating of all—listening to participants read out screen copy and instructions, you can get a little jaded, so it’s best to pace yourself. As you watch videos, you can timestamp the start and end of interesting interactions and make annotations.
Feedback on WhatUsersDo
It’s clear that the service is designed more for non-specialists than for UX professionals. While it is possible to export the raw data for a set of usability tests, this capability is somewhat hidden, in favor of a PDF report in a more traditional format. A recent change to the service prioritizes charts showing task-completion rates, time on task, and perceived ease of use over direct access to the underlying data. While I appreciate the value this provides to a non-specialist, as someone who wants to get down and dirty with the data, I find it a small barrier. That said, I do approve of the design thinking behind it: prioritizing design decisions in favor of the non-specialist rather than the specialist!
WhatUsersDo have been pretty good about both requesting and listening to my feedback. Each customer has an account manager who responds to requests. Mine invited me to provide direct feedback via a Skype call, during which I raised several points, and the team acted on them very shortly afterward. While this may have been a happy coincidence, being listened to still gave me a sense of satisfaction, so I’m one happy customer!
WhatUsersDo is not a perfect tool. It would be great to have the ability to pick a particular social group rather than only the broad ABC1 or C2DE classifications. I’d really like to be able to speed up the playback of videos while still being able to listen to the audio. This would let me process the reviewers’ feedback much more quickly, listening just for verbal comments and skipping the instructions. Overall, though, I’ve been very favorably impressed with WhatUsersDo. It provides a cost-effective, reliable approach to getting detailed user feedback.