(such as Home, Work, In the Car, and so on), and they asked us to uncover the unmet needs that people had in those particular environments. It turned out that the specific within-environment needs people had were just not that big a deal, but what people really struggled with was moving between environments, or moving between contexts: from being alone to being in a social situation, from being stationary to being mobile, and so on. These were the real challenges for people. For example, if you’ve worn one earbud and let the other dangle so you could stay somewhat engaged, you’ve dealt with this particular issue.

      So, we excitedly reported to our client that we had found the “real” problem for them to solve. We were met with uncomfortable silence before they told us that they had committed organizational resources to addressing the problem as it currently stood. In our enthusiasm, we had trouble hearing them, and for a few minutes, the conversation was tense.

      Finally, we stated definitively that we had learned some specific things about the environments, and we saw a rich and complex opportunity in this new problem. And that was all it took. We delivered findings about each environment, and then we delved into the harder problem. It turns out that our client was eager to innovate, but they just needed to have their initial brief addressed. It became an important lesson for me: Reframing the problem extends it; it doesn’t replace the original question.

      There are numerous ways to gather data about users: usability testing, A/B testing, quantitative surveys, Web analytics, interviewing, focus groups, and so on. For the closest thing to a “Grand Unified Field Theory of User Research,” see these examples by Elizabeth B. N. Sanders (see Figure 1.3) and Steve Mulder (see Figure 1.4). Both do a nice job of creating an organizing structure around the surfeit of research techniques we are blessed with.

[Figure 1.3: Elizabeth B. N. Sanders, Make Tools, LLC, 2012]

[Figure 1.4: Steve Mulder]

NOTE: EXAMPLES OF USER RESEARCH APPROACHES

      • Usability testing: Typically done in a controlled environment such as a lab, users interact with a product (or a simulation of a product) and various factors (time to complete a task, error rate, preference for alternate solutions) are measured.

      • A/B testing: Comparing the effectiveness of two different versions of the same design (e.g., advertisement, website landing page) by launching them both under similar circumstances.

• Quantitative survey: A questionnaire, primarily using closed-ended questions, distributed to a larger sample in order to obtain statistically significant results.

      • Web analytics: Measurement and analysis of various data points obtained from Web servers, tracking cookies, and so on. Aggregated over a large number of users, Web analytics can highlight patterns in navigation, user types, the impact of day and time on usage, and so on.

      • Focus group: A moderated discussion with 4 to 12 participants in a research facility, often used to explore preferences (and the reasons for those preferences) among different solutions.

      • Central location test: In a market research facility, groups of 15 to 50 people watch a demo and complete a survey to measure their grasp of the concept, the appeal of various features, the desirability of the product, and so on.

Interviewing isn’t the right approach for every problem. Because it favors depth over sample size, it’s not a source of statistically significant data. And because each semi-structured interview is unique, it’s hard to objectively tally data points across the sample. Although we are typically interviewing in context, it’s not fully naturalistic: a tool that intercepts and observes users who visit a website captures their actual behavior, whereas sitting with users and having them show you how they use a website is an artifice.

Interviews are not good at predicting future behavior (especially future purchase intent) or at uncovering price expectations. Asking those questions in an interview will reveal the mental models that exist today, which can be insightful, but won’t necessarily be accurate.

But interviewing can be used in combination with other techniques. In a note earlier in this chapter, I described how a quantitative study helped focus our contextual interviewing and observations. In other situations, we’ve used an exploratory interviewing study to identify topics for a global quantitative segmentation study. We’ve also run a Central Location Test (where larger groups watched a demo in a single location, such as a research facility, and filled out a survey) simultaneously with in-home interviews, using the results of both studies to get a deeper understanding of the product’s potential. It can be valuable to combine a set of approaches and get the advantages of each.

      Is interviewing considered to be user research? Is it market research? Is it design research? I can’t answer those questions any better than you can! The answer is: it depends. Whether or not you ally yourself or your methods with any one of those areas, you can still do great work uncovering new meaning and bringing it into the organization to drive improvement and growth. At the end of the day, isn’t that what we care about? I’ll let someone else argue about the overarching definition matrix.

      Much of the technique of interviewing is based on one of our earliest developmental skills: asking questions (see Figure 1.5). We all know how to ask questions, but if we asked questions in interviews the way we ask questions in typical interactions, we would fall short. In a conversational setting, we are perhaps striving to talk at least 50 percent of the time, and mostly to talk about ourselves. But interviewing is not a social conversation. Falling back on your social defaults is going to get you into trouble!

[Figure 1.5]

Interviewing users involves a special set of skills, and it takes work to develop them. The fact that it looks like an everyday act can actually make it harder to learn how to conduct a good interview, because it’s easy to take false refuge in existing conversational approaches. Developing your interviewing skills is different from developing a technical skill (say, milkshake-machine recalibration): if you were learning about milkshake machines, you would have nothing to fall back on, but with interviewing, you may need to learn how to override something you already know. Think of other professionals who use verbal inquiry to succeed in their work: whether it is police officers interrogating a suspect or a lawyer
