Are you the sole User Experience Researcher in your organization? Do you struggle to deliver timely research insights and feedback to your stakeholders? Online research tools let practitioners gather feedback quickly and asynchronously, without direct facilitation or moderation.
In this presentation, we provide an overview of some of the many online research tools that are available for gathering quick, asynchronous feedback on requirements, designs, and stakeholder sentiment. We offer general guidelines for recruiting, planning, implementing, and analyzing feedback, and then present how to use specific methods that have proven particularly useful for design and requirements research.
How to effectively implement different online research methods - UXPA 2015 - Fadden & Bedekar
1. How to effectively implement
different online research techniques
for rapid unmoderated feedback
Niyati Bedekar
@nbedekar
Steve Fadden
@sfadden
Presented at UXPA 2015, San Diego Slides: https://goo.gl/X8dolV
2. Agenda
Online techniques
Method toolkit
Common requests and solutions
Case studies and templates
Effective practices
Image source: http://pixabay.com/en/modesto-california-scenic-trail-205544/
5. Who are you?
Years experience in
user research:
<1
1-2
2-5
5+
Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)
6. Who are you?
Total number of
employees:
1-20
21-100
101-500
500+
Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)
7. Who are you?
Most recent research
request?
Most common
research request?
Jot down
Image source: http://pixabay.com/en/photos/note%20paper/
12. Toolkit is growing (Rohrer’s framework)
[Diagram: Rohrer’s 2x2 framework maps methods along two axes: Attitudinal (“what people say”) vs. Behavioral (“what people do”), and Qualitative (“why & how to fix”) vs. Quantitative (“how many & how much”).]
Rohrer, C. (2014, October 12). When to use which user experience research methods. Retrieved from http://www.nngroup.com/articles/which-ux-research-methods/
Image source: http://www.freestockphotos.biz/stockphoto/1772
13. Go-to methods
Method (participant effort) - Types of answers provided
Click - Behavioral: Where to start or go next?
Preference - Attitudinal: Compare between options
Recall - Hybrid: What do you remember? What are your first impressions?
Sentiment - Attitudinal: How does this make you feel?
Embedded questions - Hybrid: What happens next, and why? How would you rate this?
Terminology/naming - Attitudinal: What does something mean?
Commenting - Hybrid: What comes to mind while reviewing a concept/flow? Or open feedback
Image source: http://www.geograph.org.uk/photo/1911269
14. Additional methods to consider
Method (participant effort) - Types of answers provided
Card sorting - Hybrid: What items belong together and what should they be called?
Discussion groups / Focus groups - Attitudinal: What comes to mind while reviewing other feedback?
Unmoderated usability testing - Hybrid: What do you expect? What do you do? Why?
Image source: http://www.geograph.org.uk/photo/1911269
16. “Finals week starts on June 1. Where would you first click to put a
reminder on your calendar?”
Click methods (Behavior: Where do users click)
UsabilityTools
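Click-test tools such as UsabilityTools score and aggregate first clicks automatically, but the underlying idea is simple: compare each participant’s first click against a target region. A minimal sketch, with made-up coordinates for the calendar-reminder control and made-up click data:

```python
# Hypothetical target region for the "add calendar reminder" control,
# in screenshot pixel coordinates: (left, top, right, bottom).
TARGET = (640, 40, 720, 80)

def is_hit(x: int, y: int, target=TARGET) -> bool:
    """True if a first click lands inside the target rectangle."""
    left, top, right, bottom = target
    return left <= x <= right and top <= y <= bottom

# First-click coordinates from a handful of (made-up) participants.
clicks = [(655, 60), (700, 75), (120, 300), (645, 40)]
hits = sum(is_hit(x, y) for x, y in clicks)
print(f"First-click success: {hits}/{len(clicks)}")
```

Misses are often as informative as the success rate itself: clustered misses usually point at a competing element that draws attention away from the intended control.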
17. “Describe what you would expect to see after clicking the area in the previous screen.”
Embedded question (Hybrid: What happens next)
Qualtrics
18. “Please click the variation you prefer. [after] Why did you choose it?”
Preference (Attitude: Which do you prefer)
Verify
19. “You will see a screen for 5 seconds. After reviewing the screen, you’ll
be asked questions about it. [after] What do you remember?”
Recall (Hybrid: What do you remember)
Verify
20. “Review this screen and think about how it makes you feel.”
Sentiment (Attitude: How does this make you feel)
Verify
21. “Do you find this design to be attractive?”
Embedded question (Attitude: How do you rate this)
SurveyMonkey
22. “Label each marker with what you would call the icon.”
Terminology/naming (Attitude: What does this mean)
Verify
23. “This design shows what happens when you click the ‘+’ icon.
Comment on areas you find confusing, problematic, helpful, usable.”
Commenting (Hybrid: What comes to mind)
Verify
27. Form groups of 3-5
Review common requests
Discuss how you typically research
Consider online solutions
Discuss pros/cons
Discussion: Research requests
Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird
28. Reference (for Activity)
(The go-to methods table from slide 13, repeated for use during the activity.)
Image source: http://www.geograph.org.uk/photo/1911269
29. Group discussion: Share thoughts
● Problem
● Typical solution
● Online research solution
● Pros/cons
Discussion: Research requests
Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird
31. Case Study 1: Evaluate new data export concept
Background
- New functionality for an existing
product
- Integrated with 3rd party software
- To be implemented ASAP
Goals
- “Boil the ocean” to learn if concept
was understood, desired, and usable
Methods
- Embedded question: critical incident
- Embedded question: comprehension rating
- Commenting: on each storyboard panel, after presenting full story
- Embedded question: open feedback, questions, and expectations
32. “Consider the last time you had to export data. Describe why you
needed to export data, and list the steps you remember from that
process. (If you haven’t exported data before, or don’t remember the
last time, just skip to the next question).”
Embedded question (Critical Incident)
“I’m pretty old school, so I export
my credit card transaction data
about every quarter. My credit card
site has a button to export to CSV,
so I just click that and it downloads
to my computer.”
“We have our marketing, sales, and
inventory data in different systems.
I have to export data from each
system in order to combine it into a
spreadsheet for my stakeholders.
The export process is easy.
Combining the data is more
involved.”
33. “Consider the concept presented on the next 4 slides. After reading
about the concept, you will be asked about what you found to be
confusing, problematic, useful, and appealing about the concept.”
New concept scenario
[Four concept panels (1-4), presented in sequence with a progress indicator]
35. Commenting (Identify strengths and weaknesses)
“You will now be shown each concept slide again. On each slide,
indicate anything you found to be particularly confusing, problematic,
useful, and appealing.”
[The four concept panels, shown again with commenting enabled]
Sample comments:
“Doing this would require a lot of clicks, even for a small number of columns.”
“You should embed best practices for naming here. Otherwise, the result could be messy.”
“Will we be able to save the mappings? That could save time in the future.”
36. “Any final comments, questions, or feedback you’d like to share?”
Embedded question (Open feedback)
“It’s great that you don’t have to
jump around different parts of the
system to do this. Very valuable to be
able to complete this from one
place.”
“Seems very clear to me. I think
anyone who has used [XYZ] would
be able to understand it too.”
“Hi, I wanted to
follow up to
reiterate that this
is a REALLY COOL idea
and it fills a much
needed requirement
for our use of the
product. Please
consider me for
future studies like
this, because we need
this functionality!”
37. Template 1: Exploring a new concept
1. NDA, confidentiality, demographics
2. Embedded question: critical incident to activate
3. [Present concept] Video, illustration, storyboard, description
4. Embedded question: comprehension rating, after presenting concept
5. Commenting: concept slides (storyboards work well)
6. Embedded question: open feedback
38. Case Study 2: Identify problems and preferences for
calendar range selection tools
Background
- Tool developed without support
- Early stage prototype, only worked
within company firewall
- Team wanted feedback before further
refinement
Goals
- Recruit internal participants only
- Identify heuristic violations
- Gauge preference compared to
existing tools
Methods
- Click: how would you start the task?
- Commenting: (after using prototype) see screenshots of tool in different states
- Preference: compare tool to existing tool
- Embedded question: explain preference and next steps
39. Template 2: Eliciting usability/heuristic feedback
1. NDA, confidentiality, demographics
2. Recall: what is remembered? [or] Sentiment: how does this make you feel?
3. Click: how would you start this task?
4. Embedded question: what would you expect to see after clicking?
5. Commenting: open feedback, after engaging
6. Embedded question: usability rating
40. Case Study 3: Redesign chart type & update visual
treatment
Background
- Existing component used frequently
by customers and loved by many!
- Not scalable
- Prone to misinterpretation
- Team wanted to test new designs
Goals
- Understand if users comprehend the
new design
- Gauge preference among 3 different
approaches (including existing)
- Mix of internal users and customers
Methods
- Embedded question: understandability of information
- Preference: among the various options
- Commenting: open feedback, expectations
41. Template 3: Redesigned visual treatment
1. NDA, confidentiality
2. Embedded question: to gather understanding of information on chart (randomize)
3. Preference: which design do you prefer? (randomize)
4. Embedded question: why the selected design?
5. Commenting: open feedback
6. Demographics
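Template 3 calls for randomizing both the comprehension questions and the order in which the designs are shown, since presentation order can bias preference results. Many tools (e.g., Qualtrics) randomize for you; when one doesn’t, a minimal sketch of per-participant randomization, using hypothetical design labels, might look like:

```python
import random

# Hypothetical labels for the three chart designs in Case Study 3.
DESIGNS = ["existing", "redesign_a", "redesign_b"]

def presentation_order(participant_id: int) -> list:
    """Return a per-participant shuffled order for the preference question.

    Seeding with the participant ID makes the shuffle reproducible, so each
    response can later be re-associated with the exact order that participant saw.
    """
    rng = random.Random(participant_id)
    order = DESIGNS.copy()
    rng.shuffle(order)
    return order

print(presentation_order(1))
print(presentation_order(2))
```

Recording the order alongside each response also lets you check afterward whether position, rather than design, predicted the winner.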
42. Case Study 4: Understand how people find content
Background
- Team assigned to build new system
- Wanted to create a system where
content was easy to locate
Goals
- Identify how users locate content
- Discover differences based on content
type
- Understand pain points to see if they
can be reduced or eliminated
Methods
- Click: (for each method) where do you click first to locate this kind of content?
- Sentiment: what feeling is associated?
- Commenting: open feedback, expectations
- Embedded question: (after each method) what do you find most/least usable?
43. Template 4: Understanding behavior and expectations
1. NDA, confidentiality, demographics
2. Embedded question: critical incident to activate
3. Click: what do you do first?
4. Sentiment: how do you feel when you do this?
5. Commenting: what works well and not well?
6. Embedded question: open feedback
62. Examples of types of tests available (incomplete list)
Test types (columns): Click/Success, Preference, Recall, Sentiment, Question, Terminology/Label, Commenting, Card sorting, Discussion, Unmoderated usability + video on website, Metrics & Results
Verify: ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
Usabilla: ✓ ✓ ✓ ✓
Loop11: ✓ ✓ ✓
UserTesting.com: ✓ ✓
UserZoom: ✓ ✓ ✓ ✓ ✓ ✓
Optimal Workshop: ✓ ✓
Yahoo Groups, Facebook, LinkedIn: ✓
Survey tools (GetFeedback, Qualtrics, SurveyMonkey): ✓ ✓ ✓ ✓
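Several of these tools report metrics and results automatically; when you export raw responses instead, a preference tally is more useful with an uncertainty estimate, since quick unmoderated studies often have small samples. A sketch using made-up picks and a Wilson score interval (the design labels and counts are hypothetical):

```python
import math
from collections import Counter

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """Approximate 95% Wilson score interval for a proportion.

    More trustworthy than the normal approximation at the small
    sample sizes typical of quick unmoderated studies.
    """
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - margin, center + margin)

# Hypothetical raw preference picks from 20 participants.
picks = ["redesign_a"] * 12 + ["redesign_b"] * 5 + ["existing"] * 3
counts = Counter(picks)
n = len(picks)
for design, c in counts.most_common():
    lo, hi = wilson_interval(c, n)
    print(f"{design}: {c}/{n} ({lo:.0%}-{hi:.0%})")
```

With 12/20 picks, the interval still spans roughly 39% to 78%, a useful reminder that a 60% “winner” from 20 participants is suggestive rather than conclusive.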
63. Additional Links
Christian Rohrer’s NN/g article on when to use which research method to answer specific questions: http://www.nngroup.com/articles/which-ux-research-methods/
A review of usability and UX testing tools: http://www.smashingmagazine.com/2011/10/20/comprehensive-review-usability-user-experience-testing-tools/
How to select an unmoderated user testing tool to fit your needs: http://www.nngroup.com/articles/unmoderated-user-testing-tools/
Lists of tools for unmoderated testing:
1. http://remoteresear.ch/tools/
2. http://www.infragistics.com/community/blogs/ux/archive/2012/11/07/6-tools-for-remote-unmoderated-usability-testing.aspx
Kyle Soucy’s article in UX Matters (Unmoderated, Remote Usability Testing: Good or Evil?): http://www.uxmatters.com/mt/archives/2010/01/unmoderated-remote-usability-testing-good-or-evil.php