
June 23, 2010

Adventures in remote usability, Part 1: Why remote?

I'm assuming that you're already sold on the basic premise of why one ought to observe members of one's target market using one's software. Personally, I drank the user-centered design Kool-Aid at CMU in the late 90s. The biggest problems I've dealt with in my career aren't about "You should watch users and fix the problems they have!" Mostly I run into logistical issues. How do you execute the different types of data collection that you need for a clear and accurate picture of user behavior? How do you track those findings and use them to inform actual product decisions, often over a period of several months?

I've handled our user research and user information needs for the past year and a half at ChoiceVendor. Managing a usability program here is nothing like plunking down on the couch in one of Google's usability lab observation rooms and kicking back while one of the researchers proctors the session from the other side of a one-way mirror.

For the sake of clarity, I define remote usability as "any method of gathering usage and behavioral observations about site users who are not physically coming to your office." In other words, it doesn't just mean Nielsen-style discount usability studies that gather qualitative feedback from 5 or 6 people.

Remote studies get you a little closer to users' real experiences.

We all feel more comfortable when we're in our own environment rather than in someone else's office. I've had users who put me on hold to take another call, who have to move into another room because their kid is yelling, who ask me to hold on just a moment so they can answer a question from an employee. They're also on their own computer, which they already know how to operate and which has their email and passwords set up.

We have special users and can't test just anyone.

We have a moderately specialized user base: people who own businesses, or are managers, or are some other sort of purchasing decision-maker at their company. Because of this, we can't plunk ourselves down in a coffee shop and wave $10 gift certificates at people in exchange for 15 minutes of study time. (That's what we did for Mopho.)

It's true that we probably could have found a decent number of issues just by doing coffee-shop usability. But the problem with hallway usability is that most of the people in our hallways are either in the software industry or live and work in the Bay Area: the epicenter of the technology world. It's not that I don't want these people to use my site -- it's that they have about a tenth of the problems that non-Bay Area professionals do. I know that they can figure out how to use the site, and they're flexible enough not to get too frustrated when something goes wrong.

So I used a market research firm (Hagen / Sinclair) to recruit qualified users from anywhere. Even these participants were relatively sophisticated: they had to have high-speed internet, a relatively modern computer capable of running GoToMeeting, and they had to be sufficiently articulate to pass the screening questions.

This costs about $135 per user in recruiting fees, plus $50 to $100 in Amazon.com gift certificates as an incentive, depending on the study. Local participants were welcome to come into the office; for everyone else, we used GoToMeeting to share screens.
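(To put a rough number on that: a hypothetical round of six remote participants, a count I'm picking purely for illustration, works out to six times roughly $210 apiece, or about $1,260, before the cost of the GoToMeeting account itself.)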

I specifically wanted people who had never used or heard of our site before, which is why we didn't recruit from our existing user base for most of these studies. We wanted qualitative feedback not just on the site UI, but on our brand, messaging, and the like. Plus, when we talk to actual members of our target audience, we also get to ask them lots of interesting contextual-inquiry and focus-group types of questions.

Note: One of our participants couldn't figure out how to add the gift card codes to her Amazon account. Yet she owns her own business: she's smart at what she knows, and just isn't an expert at the Internet.

We were able to get a higher quality of user by going remote.

Again with the theme: our user base is specialized, so I wanted to remove all other barriers to participating. If you require people to come into your office, you restrict your pool to people in your geographical area who are also able to take 2 hours out of their day to come visit. (Half an hour on either side of the 1-hour study for transportation, parking, etc.)

Few people with typical business-type jobs (much less business owners or VPs) are ever going to do that, especially for $100 in Amazon gift certificates.

Personally, I'm usually not eligible as a study participant because I work in UX, but occasionally I am part of the target audience for a given study. The last time that happened, the recruiters wanted me to phone someone on a weekend to set up a time to drive to Cupertino by 6pm later that week. Who doesn't have better things to do with their time than that?

Facial expressions are bullshit.

I'm with Nate Bolt on this one. I don't think I've seen a participant's face during a study in, let's see, the past 8 years? You don't need to see it. Everything you need to know about the participant's emotional state, you can tell from their voice.

Plus, the thought of adding one more technical barrier to a study -- requiring the user to have a webcam or video feed, and getting it properly turned on and functioning -- in the course of an hour-long study that also requires voice, an explanation of the study itself, screen sharing, and recording? Not worth the effort. I'd rather spend the time asking them more questions about their business and how they currently accomplish the tasks that we're trying to support.

Thus, I rejected any software or testing system that made a big deal out of its video-recording capabilities.

Remote can be faster and cheaper.

It's generally a bit more work to set up a remote study -- technically, you have to have more pieces in place. But for unproctored remote usability, you can run a dozen people through a 5-minute test, make some changes, and then run another dozen people. You don't have to bring someone in and come up with an hour-long study to get full use of their time.

Conclusion

For the past year and a half I've tried a number of different approaches & methods. This series of articles outlines pros, cons, funny stories, and what I've learned from different techniques.

Next: Adventures in remote usability, Part 2: GoToMeeting