
Rapid User Research - a talk from Agile 2013 by Aviva Rosenstein


Doing user research before and during development helps inform your choices about strategy (what to build) as well as tactics (how to build it), and it doesn't have to slow down your development process. In fact, some rapidly executed research can speed up your time to market by reducing the need to refactor late in a project.

This presentation includes practical information to help product owners and developers quickly get inside the heads of their users, validate product ideas, and improve the usability of their software at warp speed. The talk included tips and techniques for recruiting research participants, shadowing and interviewing users effectively, getting valuable feedback on product concepts and information architecture, and rapidly iterating on the user interface to improve usability. It also covered remote testing tools that help teams evaluate whether users can successfully achieve their goals with a design, and reviewed best practices for collecting feedback from users after launch.

Published in: Technology, Business


  1. Rapid User Research. #rapidresearch. A part of the HotHousing product kickoff & exploration framework. Aviva Rosenstein, PhD, UX Consultant, Evolve Beyond (@avivaux); Gabrielle Benefield, CEO, Evolve Beyond (@gbenefield)
  2. Why user research?
  3. Why user research?
  4. Customers don’t always share our knowledge, values, assumptions, or interests. User research helps bridge the gaps between developers and users. [Diagram labels: “Tech workers mostly here” / “End users mostly here”]
  5. You might need user research if you hear or see these phrases: “… I think they want to do this…” “I’d want it to work like this…” “They asked for this feature…” “I assume they want…”
  7. Which methods to use depends on where you are in your product lifecycle: What do we build? How do we build it? How did we do?
  8. And on what answers you need. Understanding actual user behavior: what are users doing? When? Where? Understanding reasons for behavior: why are users doing that? Understanding user attitudes and opinions: how do your users feel about doing it?
  9. Different methods provide different insights: mix ’em up. Understand users’ experiences, values, desires, and environments; measure or model behavior, predict outcomes.
  10. A few proven rapid methods. What do we build? Interviewing & shadowing users; concept tests. How do we build it? Card sorts, tree tests & click tests; RITE studies. How did we do? Compare key metrics pre and post; product experience feedback.
  11. Rapid research requirements: (1) executive and team buy-in; (2) feedback from the right people; (3) efficient data collection & analysis; (4) actionable, understandable insights; (5) in-time reporting.
  12. What do we build? Understanding users. User roles: what roles do they play in relation to the product? Characteristics: how would you describe them? Any relevant skills & knowledge? Context: what’s special about their situations?
  13. What do we build? Understanding users. Goals: what are they trying to achieve? How do they feel about these tasks? Needs: what do they need to accomplish those goals? What needs aren’t being met? Pain points: what are they doing now? What can you improve?
  14. Empathizing with users’ pain and frustration. Pain scale, 1–5 (adapted from Hyperbole and a Half).
  15. Finding the right people: push the right lever. Being heard; cash or goodies; knowing what’s coming.
  16. Interviewing. Used to explore needs, feelings, and opinions; obtain recollections and rationales; gather feedback.
  17. …sweater and asked you to wear it. “Hey, I knitted this sweater for you!” “Gosh! Thank you! (Ugh, it’s horrible.)” [Image copyright Wil Wheaton, CC BY-NC-SA 2.0]
  18. Behavior: observing vs. asking. “What people say, what people do, and what they say they do are entirely different things.”
  19. [Photo by Eric Allix Rogers, permission granted; available under a Creative Commons Attribution-Noncommercial license.]
  20. Task Card: [task description]. Performed by role: [role name]. Context of use: Where and when is it performed? In what environment? What corporate culture? Where in the development process? Direction of information flow? Device constraints/media channels? Needs for auditability, accuracy & credibility, confidentiality? Task characteristics: frequency, regularity, continuity, intensity of use, timeframe to act, complexity, predictability, who controls the process, legal/regulatory restrictions, operational/safety risks, other roles involved.
  21. Shadowing users
  22. [Photo by Jane Mejdahl, used under CC BY-SA 2.0.]
  23. Communicating insights from a shadowing session. ROLE: Business owner. TASK: Approve visual design direction. CONTEXT: Waterfall dev process; supervises multiple product managers; frequently mobile; uses iPhone. CHARACTERISTICS: Short attention span; under significant time pressure; focuses on visuals and metrics. CONTENT CRITERIA: Brief, clear presentation in common formats consumable on mobile devices. [Diagram: task flow across roles ID, Dev Mgr, VzD, PM, BO]
  24. What do we build? Validating product and design ideas. How do they feel about our concept(s)? Do they think our ideas make sense? Will our concept work for them? What features do users value?
  25. Concept interviews. Stimulate discussion with a narrative, storyboard, UI concept, prototype, demo, video, or walkthrough. Used to explore needs, rationales, and attitudes, and to gather feedback on ideas.
  26. Mackenzie is building a data-driven site and isn’t sure about the proposed schema.
  27. Mackenzie documents the schema but wants to get approval from her manager. “Are these tables the right ones?”
  28. Mackenzie sends her product manager a link to the schema. [Storyboard by Martin Hardee / Sun]
  29. Collecting responses. For each participant (P1, P2, P3), record: role or relevant characteristic; does the concept work for them?; features valued; positive or negative reaction; comments.
  30. How do we build it? How do we make this usable? How to organize the site architecture; what labels to use on the navigation; what kind of navigation to use; what visual design approach to use. Do users understand how to use the site to accomplish their goals? Does this product meet our quality standards (prior to launch)?
  31. Tools for rapid remote testing: ZURB Apps, UX Punk, Optimal Workshop.
  32. Open online card sort interface.
  33. Analysis: darker clusters indicate more associations.
  34. Data clustering results.
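A concrete way to see how “darker clusters, more associations” falls out of card-sort data: the sketch below counts how often each pair of cards lands in the same pile across participants. It is a minimal pure-Python illustration with hypothetical card labels and pile data, not a tool from the talk; dedicated tools like Optimal Workshop produce the equivalent similarity matrix for you.

```python
# Minimal card-sort analysis sketch: count pair co-occurrence across piles.
# Card labels and participant piles below are hypothetical examples.
from itertools import combinations
from collections import Counter

# Each participant sorts the cards into piles (sets of labels).
sorts = [
    [{"Pricing", "Plans"}, {"Docs", "Tutorials", "API Reference"}],
    [{"Pricing", "Plans", "Docs"}, {"Tutorials", "API Reference"}],
    [{"Pricing", "Plans"}, {"Docs", "API Reference"}, {"Tutorials"}],
]

cooccurrence = Counter()
for piles in sorts:
    for pile in piles:
        # Count every unordered pair of cards that shares a pile.
        for a, b in combinations(sorted(pile), 2):
            cooccurrence[(a, b)] += 1

# High counts correspond to the dark clusters in a similarity matrix:
# pairs most participants grouped together likely belong together in the IA.
for pair, count in cooccurrence.most_common():
    print(f"{pair[0]} + {pair[1]}: {count}/{len(sorts)}")
```

Pairs grouped by all three participants (here, Pricing + Plans) are strong candidates for sharing a navigation category.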
  35. RITE studies: Rapid Iterative Testing and Evaluation. Use it to identify and resolve usability issues in an interface, increasing levels of fidelity through the process, and to improve and validate ease of use.
  36. RITE studies: logging issues. (1) Obvious cause and solution with a quick fix (log participant #, issue, fix). (2) Obvious cause, but the solution needs time to design (log participant #, issue, fix). (3) Problem with no obvious solution; keep watching (log participant # and issue). (4) Issues related to the test script or study protocol (log participant # and issue). Example log entries: “P#1 didn’t scroll down to see CTA: move up?”; “P#2 expected to download support information: create content library”; “P#2 didn’t know that link was clickable: add underline on hover?”; “P#3 unable to locate support link”; “P#1 test script set expectation for discounts: revise.”
  37. RITE studies: considerations. Participants must represent and/or share key characteristics with target users. Conduct sessions in person or remotely using screen-sharing applications. Decision-makers must attend all sessions because decisions are made after each one. Prototypes and task scripts may change during the study, so don’t collect success metrics that depend on experimental rigor (e.g., time on task, error rate). Use either concurrent or stimulated retrospective think-aloud techniques to understand users’ expectations for and understanding of design elements. The number of participants may vary depending on the number of iterations needed. Leave some time between sessions for debriefing and making design changes; try scheduling a day between every three or four sessions to allow for changes that require additional thought or time to implement. Roles: 1 or 2 participants per session, 1 facilitator, stakeholder observers. More practical information: Medlock, M. C., et al. (2002). “Using the RITE Method to Improve Products: A Definition and a Case Study.” Usability Professionals Association, Orlando, FL, July 2002.
  38. How did we do? Evaluating success after launch. Do users understand how to accomplish their goals? Are users satisfied with it? Did we increase conversion/sales? Are we keeping users engaged?
  39. Split testing across the planes of user experience. Surface: interchangeable design elements. Skeleton: modules within a grid. Structure: complete features or versions. Scope: self-contained feature within an existing platform. Strategy: “I am so, so sorry.”
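One standard way to answer “did we increase conversion?” from a split test is a two-proportion z-test on the variants’ conversion counts. The sketch below is a minimal illustration, not part of the original talk, and the traffic and conversion numbers are made up; in practice you would fix the sample size and significance threshold before starting the test.

```python
# Minimal split-test evaluation sketch: two-proportion z-test.
# All counts below are hypothetical.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) comparing conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal: 2 * (1 - Phi(|z|)).
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Variant A: 120 conversions of 2400 visitors; variant B: 156 of 2400.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these hypothetical numbers the difference is significant at the conventional p < 0.05 level; with smaller samples the same rate difference often would not be.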
  40. Collecting subjective product experience feedback. Methods (message boards, ad hoc surveys, customer calls, focus groups, PXF survey) compared against criteria: efficient, cumulative, consistent, bias resistant, sharable, retainable, contextual. [Comparison matrix of methods vs. criteria]
  41. Product experience feedback requirements: provides actionable insights; easy to share information with team members; knowledge retained in the company; doesn’t pollute the user experience; easy to implement, uses resources efficiently; contextual to the specific feature of interest; consistent across product lines; measurable, trackable progress; construct validity, resistance to bias.
  42. Product experience feedback survey includes: open-ended responses (problem reports, suggestions, praise, other comments) and a product usability scale measuring perceptions of efficiency, utility, performance, learnability, satisfaction, and integration.
  43. Example template flow: problem filter.
  44. Collect bugs first; then group feedback by type. (1) Have you experienced any problems or errors when using [NAME OF FEATURE] in [PRODUCT NAME]? (Yes/No, randomized) (2) Please describe any problems or errors you’ve noticed while using [NAME OF FEATURE]. (3) What, if anything, do you like most about [NAME OF FEATURE]? (4) Do you have any ideas or suggestions for improving [NAME OF FEATURE]? (5) If there is anything else you’d like us to know about [NAME OF FEATURE] in [PRODUCT NAME], tell us here.
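The problem-filter flow can be sketched as simple branching logic: the detailed problem-description question is only shown when the screener answer is Yes, so non-affected users skip straight to the open-ended feedback. The question IDs and routing below are hypothetical, written to mirror the five-question template.

```python
# Minimal survey-branching sketch for the problem filter.
# Question IDs and routing are hypothetical illustrations.

QUESTIONS = {
    "q1": "Have you experienced any problems or errors using [FEATURE]?",
    "q2": "Please describe any problems or errors you've noticed.",
    "q3": "What, if anything, do you like most about [FEATURE]?",
    "q4": "Do you have any ideas or suggestions for improving [FEATURE]?",
    "q5": "Anything else you'd like us to know?",
}

def next_question(current, answer):
    """Route the respondent; the problem report (q2) only shows after a 'yes'."""
    if current == "q1":
        return "q2" if answer == "yes" else "q3"
    order = ["q2", "q3", "q4", "q5"]
    i = order.index(current)
    return order[i + 1] if i + 1 < len(order) else None  # None ends the survey

print(next_question("q1", "no"))   # respondent with no problems skips q2
```

Separating the bug screener from the open-ended questions keeps problem reports clean for triage while still collecting praise and suggestions from everyone.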
  45. Product experience ratings: subjective experience metrics. (6) Please rate how strongly you agree or disagree with each of the following statements (1 = Strongly Disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly Agree): (a) I expect to use [NAME OF FEATURE] within [PRODUCT NAME] frequently. (Utility) (b) [PERFORMING KEY USER STORY] with [NAME OF FEATURE] is easy and straightforward. (Ease of use) (c) I am satisfied with the [NAME OF FEATURE] in [PRODUCT NAME]. (Satisfaction) (d) I had to learn a lot of things before I could use the [NAME OF FEATURE] effectively. (Learnability) (e) The [NAME OF FEATURE] works seamlessly with the rest of the [PRODUCT NAME] application. (Integration) (f) When I use the [NAME OF FEATURE] it feels quick and responsive. (Performance)
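Turning these ratings into trackable per-dimension metrics can be sketched as below. The item-to-dimension mapping follows the slide’s labels; the reverse-scoring of the negatively worded learnability item (d) and the sample responses are my assumptions, not from the talk.

```python
# Minimal sketch of scoring the Likert ratings into dimension averages.
# Mapping follows the slide; reverse-scoring of (d) and the sample
# responses are assumptions for illustration.

ITEMS = {  # item letter -> (dimension, reverse-scored?)
    "a": ("utility", False),
    "b": ("ease_of_use", False),
    "c": ("satisfaction", False),
    "d": ("learnability", True),   # "I had to learn a lot..." is negative
    "e": ("integration", False),
    "f": ("performance", False),
}

def score(responses):
    """responses: list of dicts mapping item letter -> Likert rating 1..5."""
    totals = {}
    for r in responses:
        for item, rating in r.items():
            dim, reverse = ITEMS[item]
            value = 6 - rating if reverse else rating  # flip the 1..5 scale
            totals.setdefault(dim, []).append(value)
    # Average each dimension so releases can be compared over time.
    return {dim: sum(vals) / len(vals) for dim, vals in totals.items()}

responses = [
    {"a": 5, "b": 4, "c": 4, "d": 2, "e": 4, "f": 5},
    {"a": 4, "b": 3, "c": 4, "d": 4, "e": 3, "f": 4},
]
print(score(responses))
```

Tracking these averages per feature and per release gives the “measurable, trackable progress” the requirements slide calls for.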
  46. Rapid + valuable = 6 steps: (1) engage stakeholders early and often; (2) keep the plan focused; (3) get a representative sample; (4) ask questions and listen without bias; (5) collect data efficiently (but follow up on hunches and surprises); (6) share actionable findings.
  47. Thanks, y’all. Send your flames, ideas, and comments to @avivaux.