Remote Research. Tony Tulathimutte

a hassle for you to just bring people into a lab, then by all means bring ’em in. Sometimes this decision can be a tough call; users in the developing world may have limited access to the Internet, for instance, so you’d have to decide whether it’s worthwhile to fly over and talk to users in person, or to find people from that demographic in your area, or to arrange for the users to be at a workable Internet kiosk to test them remotely.

      For clarity’s sake, let’s talk about some clear-cut cases of things you should and shouldn’t test remotely.

      Remote testing is a no-brainer for Web sites, software, or anything that runs on a desktop computer—this is the kind of stuff remote research was practically invented to test. The only hitch is that the participants need to be able to use their own computer to access whatever’s being tested. Other Web sites besides your own are a cinch: just tell your users during the session to point their Web browsers to any address you want. If you’re testing prototype software, there needs to be a secure way to digitally deliver it to them; if it’s a prototype Web site, give them temporary password-protected access. If the testing is just too confidential to give them direct access on their computer, you can host the prototype on your own computer and use remote access software like VNC or GoToMeeting to let them have control over the computer. There’s almost always a way to do it.
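      If you do end up hosting a prototype yourself and just need to keep strangers out, even a tiny script can put a password in front of it. The following is only a minimal sketch, not a method from the book: it assumes placeholder credentials, a placeholder port, and that the current directory holds static prototype files, and it simply gives a participant temporary, password-protected access over HTTP Basic Auth.

# Minimal sketch (assumptions: placeholder credentials and port; the current
# directory holds the prototype files). Serves the prototype behind HTTP Basic
# Auth so only a participant with the password you give them can reach it.
import base64
from http.server import HTTPServer, SimpleHTTPRequestHandler

USERNAME, PASSWORD = "participant", "session42"   # placeholder credentials
EXPECTED = "Basic " + base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()

class AuthHandler(SimpleHTTPRequestHandler):
    def do_GET(self):
        # Keep asking for credentials until the participant supplies the right ones
        if self.headers.get("Authorization") != EXPECTED:
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="Prototype"')
            self.end_headers()
            return
        super().do_GET()   # otherwise serve the prototype files normally

if __name__ == "__main__":
    HTTPServer(("", 8000), AuthHandler).serve_forever()   # placeholder port

      Run it from the prototype's folder, give the participant the address and password at the start of the session, and shut it down when you're done; your IT team or a hosted service can of course do the same thing more robustly.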

      The stuff you test doesn’t even have to be strictly functional. Wireframes, design comps, and static images are all doable; we’ve even tested drawings on napkins (really). Just scan them into a standard image format and put them on a Web site. To keep the users’ browsers from automatically resizing the images, use a plain HTML wrapper around each one. There are also plenty of software solutions (like Axure and Fireworks) that can help you convert your images to HTML.
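      If you have a folder full of scanned images, even a short script can crank out those wrappers for you. This is just an illustrative sketch, not something from the book; the folder, file pattern, and file names are placeholder assumptions.

# Illustrative sketch (assumptions: images are PNGs in the current folder).
# Generates a bare HTML wrapper page for each scanned image so the browser
# shows it at full size instead of shrinking it to fit the window.
from pathlib import Path

WRAPPER = ('<!DOCTYPE html><html><body style="margin:0">'
           '<img src="{name}" alt="{name}"></body></html>')

for image in Path(".").glob("*.png"):      # scanned wireframes, comps, napkins
    page = image.with_suffix(".html")      # e.g. homepage-comp.png -> homepage-comp.html
    page.write_text(WRAPPER.format(name=image.name))
    print("Wrote", page)

      Upload the images and the generated pages together, and point participants at the HTML pages rather than at the raw image files.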

      Can you test programs that require users to enter personal information? Yes, but make sure to give your participants a way to enter “dummy” information wherever they’re required to enter sensitive or personally identifying information. (According to Rolf Molich, people act differently when using dummy information, so bear that in mind.) If you require the participant to use real personal information, be sure to obtain explicit consent right at the beginning of the testing session (an issue covered in Chapter 4); you don’t want to spend 20 minutes on the phone with a user only to have to terminate the study over privacy issues.

      Most remote research tools (screen sharing, recording, chat, etc.) are suited for a computer desktop environment, so physical products are harder to test remotely. We’re just beginning to see mobile device and mobile interface research become feasible, and we’ve researched interfaces like cars and computer games using some remote research methods. Plus, webcams, Web video streaming, and wireless broadband are all becoming more accessible, so there’s plenty of hope. But physical interfaces will require you to come up with some creative solutions and workarounds beyond the standard remote desktop testing approach. These approaches may or may not be worthwhile; see Chapter 9 for examples of some of these alternative remote approaches.

      Case Study: Lab vs. Remote

      By Julia Houck-Whitaker, Adaptive Path

      (and Bolt | Peters alum)

      In 2002, Bolt | Peters conducted two usability studies on the corporate Web site of a Fortune 1000 software company. Both studies used identical test plans, but one was executed in a traditional usability lab, whereas the other was conducted remotely using an online screen sharing tool.

      Summary

      Our comparison showed key differences in the areas of time, recruiting, and resource requirements, as well as the ability to test geographically distributed user audiences. Table 1.1 summarizes the key differences we found comparing the two methods. There appeared to be no significant differences in the quality and quantity of usability findings between remote and in-lab approaches.

      Table 1.1 Overview Comparison of Lab and Remote Methods

      

[Table 1.1 image: http://www.flickr.com/photos/rosenfeldmedia/4286397757/]

      Detailed Comparison of Methods

      Tables 1.2, 1.3, and 1.4 break down the recruiting, testing environment, and findings for the two methods, respectively. The left-hand column describes the lab study details; the right-hand column describes the remote study details.

      Table 1.2 Lab vs. Remote Recruiting

      

[Table 1.2 image: http://www.flickr.com/photos/rosenfeldmedia/4287138014/]

      Recruiting for the lab-based study was outsourced to a professional recruiting agency (see Table 1.2). Ten users were recruited, screened, and scheduled by G Focus Groups in San Francisco, including two extra recruits in case of no-shows. Recruiting eight users through the recruiting agency took 12 days. Agency-assisted recruiting successfully provided seven test subjects for the lab study; the eighth recruit did not fulfill the testing criteria.

      Recruiting for the remote study was conducted using an online pop-up on the software company’s corporate Web site. The recruiting pop-up, hosted by the researchers, used the same questions as G Focus Groups’ recruiting screener. Users in both studies were selected based on detailed criteria such as job title and annual company revenues. Respondents to the online screener who met the study’s qualifications were contacted in real time by the research moderators. The online recruiting method took one day and yielded eight users total, from California, Utah, New York, and Oregon. Normally, the live screener requires four days of lead time to set up, but in this case it had already been set up for a previous project, so no additional setup was necessary.

      Table 1.3 Lab vs. Remote Environment

      

[Table 1.3 image: http://www.flickr.com/photos/rosenfeldmedia/4287138116/]

      The lab study was conducted in the software company’s in-house usability lab in Pleasanton, California; the recruits traveled to the lab to participate and used a Windows PC there. In addition to capturing the users’ audio and screen movements, the lab sessions also recorded their facial expressions.

      The remote usability study was conducted using a portable lab from the software company’s headquarters in Pleasanton, California. The live recruits participated from their native environments and logged on to an online meeting allowing the moderators to view the participants’ screen movements. The users’ audio and screen movements were captured.

      Both studies uncovered usability issues of high value to the client, and the issues were of similar quality and usefulness (see Table 1.4). The lab method uncovered 98 key findings, compared with 116 findings in the remote study, a difference that was not statistically significant.

      Table 1.4 Lab vs. Remote Findings

      
