This document introduces survey methods. It covers the key stages of the survey process: establishing goals, deciding whom to ask and how many, developing the questionnaire, and analysing the responses. It emphasises that the goal of a survey is to obtain numbers that can inform decisions, and stresses the importance of asking the right questions of the right people to obtain valid, representative results. Testing questions is recommended to ensure they can be clearly understood and accurately answered. The full survey process involves multiple iterative steps to obtain useful insights.
Surveys that Work 2020: training course for HMRC user researchers 2020
1. Surveys that work
An introduction to
using survey methods
Caroline Jarrett and Jane Matthews
@cjforms and @janematthews
2020 #surveysthatwork
2. Caroline Jarrett @cjforms (CC) BY SA-4.0
Introductions
(We’re Caroline Jarrett and Jane Matthews)
• Your name and role
• A random thing about yourself
Image credit: Caroline Jarrett
2
4. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fill in this questionnaire
1. How many surveys have you run?
NONE 1 to 5 6 to 10 more than 10
2. What is your top tip for a better survey, based on
experience of writing or answering?
__________________________________
__________________________________
Jarrett, C. and Bachmann, K (2002) Creating Effective User Surveys,
49th Society for Technical Communication Conference, Nashville TN USA
6. Caroline Jarrett @cjforms (CC) BY SA-4.0
Try this as an interview
1. How many surveys have you run?
NONE 1 to 5 6 to 10 more than 10
2. What is your top tip for a better survey, based on
experience of writing or answering?
__________________________________
__________________________________
Jarrett, C. and Bachmann, K (2002) Creating Effective User Surveys,
49th Society for Technical Communication Conference, Nashville TN USA
7. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(Diagram of the survey process:
Goals: establish your goals for the survey → questions you need answers to
Sample: decide who to ask and how many → people you will invite to answer
Questionnaire: build the questionnaire → questions people can interact with
Fieldwork: run the survey from invitation to follow-up → people who actually answer
Responses: clean the data → answers
Insights: analyse and present the results → decisions
Throughout: test the questions → questions people can answer)
7
9. Caroline Jarrett @cjforms
The survey is a
systematic method
for gathering information from
(a sample of) entities
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
Groves, Robert M.; Fowler, Floyd J.; Couper, Mick P.; Lepkowski, James M.; Singer, Eleanor & Tourangeau, Roger (2004). Survey methodology. Hoboken, NJ: John Wiley & Sons.
10. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) entities
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
11. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) people
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
12. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) people
for the purpose of
getting numbers
of the attributes of the larger population
of which the entities are members.
13. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) people
for the purpose of
getting numbers
that you can use to make decisions
Caroline Jarrett @cjforms (CC) BY SA-4.0
14. Caroline Jarrett @cjforms
The survey is a
process for getting
answers to questions
To make decisions People
getting numbers
Caroline Jarrett @cjforms (CC) BY SA-4.0
15. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
15
16. Is this a survey or something else?
• Review these questions
• Decide whether they are a survey or something else
16
17. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
17
18. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
Goals
Questions
Questionnaire
Sample
Fieldwork
Responses
Insights
18
22. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(The full survey-process diagram again: Goals → Sample → Questionnaire → Fieldwork → Responses → Insights, each stage producing its output, with 'Test the questions' running alongside.)
22
24. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(Diagram, highlighting Goals: establish your goals for the survey → questions you need answers to.)
24
25. Caroline Jarrett @cjforms (CC) BY SA-4.0
Establish your goals for the survey
Goals
What do you want to know?
Why do you want to know?
What decisions will you make
based on these answers?
25
28. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
1950s mindset: “Ask Everything”
Survey =
Big Honkin’ Survey
29. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
2020 mindset: the Light Touch survey
• Choose ONE question
• Find ONE person
• Ask the question, face-to-face
• See if you can make ONE decision
• Improve, iterate, increase
29
32. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
What’s the Most Crucial Question?
• We want to ask the fewest questions that will help us to
make the decision, so we need to know which are the
most useful questions
• Even better: know the specific Most Crucial Question
• A Most Crucial Question has a numeric answer
32
33. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
What’s the Most Crucial Question?
Look through the questions in this survey
What is the Most Crucial Question?
34. Caroline Jarrett @cjforms
Goals
Another way: narrow down.
Lots of questions → Useful questions → Most Crucial Question
34
35. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
Talk to users about
the topics in your survey
• Who are they?
• How will you find them?
• Do they want to answer your questions?
• Do they understand your questions?
35
36. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(Diagram, highlighting the first two stages: Goals → questions you need answers to; Sample → people you will invite to answer.)
36
37. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Asking the right people is better than asking lots of people.
Sample: the list you sample from.
37
38. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Choose a good list
Coverage error:
Mismatch between the people you
want to ask and the list you
choose to sample from
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
40. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Difference between response,
response rate and representativeness
Response: number of answers (e.g. 5,000)
Response rate: response divided by the number of invitations (e.g. 10%)
Representativeness: whether the respondents you get are typical of the users you want
Image credit: North Korean flag, http://commons.wikimedia.org/wiki/File:Flag_of_North_Korea.svg
40
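A minimal sketch (not from the slides) of the response-rate arithmetic in the table above; the 50,000 invitations figure is implied by the slide's example of 5,000 answers at a 10% rate:

```python
def response_rate(responses: int, invitations: int) -> float:
    """Response rate = number of answers divided by the number of invitations."""
    return responses / invitations

# The slide's example: 5,000 answers at a 10% rate implies 50,000 invitations
print(f"{response_rate(5_000, 50_000):.0%}")  # -> 10%
```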
41. Sample
Did we get answers from
the right people?
Is this sample representative?
Image credit: Caroline Jarrett / CorelDraw
41
42. Sample
Check the representativeness
of your sample
Population of assorted birds
Is this sample representative?
Image credit: Caroline Jarrett / CorelDraw
42
44. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Decide how to target
the correct people
• Iterate down from a list
– Public list
– Private list
• Try a ‘snowball’
– Send and hope
– Use contacts
• Catch them
in the moment
Image credit: Flickr sunchild57
44
45. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Non-response error
is the one that hurts
Non-response error:
The ones who answer differ from
the ones who don’t answer in a
way that affects the survey statistic
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
47. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Response depends on
effort, reward and trust
People will only respond if they trust
you. After that, it's a balance between
the perceived reward from filling in the
survey compared to the perceived
effort that's required. Strangely
enough, if a reward seems 'too good to
be true' that can also reduce the
response.
Diagram from Jarrett, C, and Gaffney, G (2008)
“Forms that work: Designing web forms for usability”
inspired by Dillman, D.A. (2000)
“Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
47
55. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Overcome the ‘Zone of Indifference’
by asking about the Burning Issues
55
56. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
There is always
sampling error
Sampling error:
Ask a sample instead of
asking everyone
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
57. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
If you get the other decisions right,
you can calculate a confidence interval
https://www.omnicalculator.com/statistics/confidence-interval
57
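A sketch of the arithmetic behind calculators like the one linked above, using the normal approximation for a proportion; the example numbers are illustrative, not from the slides:

```python
import math

def proportion_ci(p_hat: float, n: int, z: float = 1.96):
    """Approximate 95% confidence interval for an observed proportion.
    p_hat: proportion seen in the sample; n: number of responses."""
    margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - margin, p_hat + margin

# e.g. 47% of 66 respondents ticked 'like'
low, high = proportion_ci(0.47, 66)
print(f"{low:.0%} to {high:.0%}")  # roughly 35% to 59%
```

Note how wide the interval is at n = 66: small samples give rough numbers, which is why the sample-size decisions matter.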
60. Caroline Jarrett @cjforms (CC) BY SA-4.0
A survey is only valid if the questions
match the reason you’re doing it
Lack of validity:
mismatch between what you ask and
what you need to know
Caroline Jarrett @cjforms (CC) BY SA-4.0
60
61. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(Diagram, highlighting Goals, Sample, and 'Test the questions' → questions people can answer.)
61
63. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
Helps a lot if you ask good questions
Questions:
What are you
asking about?
How many
questions?
Caroline Jarrett @cjforms (CC) BY SA-4.0
63
64. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
There are four steps to
answer a question
Understand
Find
Decide
Place
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000)
“The psychology of survey response”
65. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
There are four steps
to answer a question
Step A good question …
1. Read and understand is legible and makes sense
2. Find an answer asks for answers that we know
3. Decide on the answer asks for answers we’re happy to reveal
4. Place the answer offers appropriate spaces for the answers
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000)
“The psychology of survey response”
67. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
Four step examples:
1: read and understand
Hermann grid illusion
67
68. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
Four step examples:
2: find the answer
In your last five days at work, what percentage of your work time do you estimate that you spend using publicly-available online services (not including email, instant messaging and search) to do your work using a work computer or other device?
68
73. Questions
Make suggestions about
draft questions, using the four steps
• We have some early draft questions
• Think about the four steps of answering a question:
1. Read and understand the question
2. Find the answer (including the Approximate Curve of Forgetting)
3. Decide whether the answer fits
4. Place the answer
• Any suggestions for the next draft?
73
74. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
74
(Diagram, highlighting Goals, Sample, Questionnaire, and tested questions becoming questions people can interact with.)
75. Caroline Jarrett @cjforms (CC) BY SA-4.0
A good question gets good answers
Measurement error:
Mismatches between
the questions you ask and
the answers that people give you
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
77. Caroline Jarrett @cjforms
Questionnaire
"Phone photography" by Petar Milošević, own work, licensed under CC BY-SA 3.0 via Commons: https://commons.wikimedia.org/wiki/File:Phone_photography.jpg
Modified by Caroline Jarrett
77
79. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
“Place the answer” is also about
using the right widget to collect the answer
Use For
Radio buttons A single known answer
Check boxes Multiple known answers
Text boxes Unknown answers
79
80. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
Likert had several types
of response format in his scales
Likert, Rensis. (1932). A Technique for the Measurement of Attitudes.
Archives of Psychology, 140, 1–55.
80
81. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
You can find an academic paper to
support almost any number of response points
Krosnick, J. A. and S. Presser (2009). Question and Questionnaire Design.
Handbook of Survey Research (2nd Edition) J. D. Wright and P. V. Marsden, Elsevier.
http://bit.ly/KNWlio
81
82. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
Grids are often full of problems
at all four steps
82
83. Questionnaire
An example
Here’s an example of a questionnaire Likert scale
• Which statements would you definitely keep?
• Which statements would you amend/edit?
• Any statements you would prefer to remove?
• Any formatting thoughts before we turn it
into a questionnaire?
83
84. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
Grids are a major cause of
survey drop-out
Total incompletes across the 'main' section of the questionnaire (after the introduction stage):
Subject Matter 35%, Media Downloads 20%, Survey Length 20%, Large Grids 15%, Open Questions 5%, Other 5%
Source: Database of 3 million+ web surveys conducted by Lightspeed Research/Kantar
From Coombe, R., Jarrett, C. and Johnson, A. (2010) “Usability testing of market research surveys” ESRA Lausanne
84
85. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
But it’s the topic that matters most
Total incompletes across the 'main' section of the questionnaire (after the introduction stage):
Subject Matter 35%, Media Downloads 20%, Survey Length 20%, Large Grids 15%, Open Questions 5%, Other 5%
Source: Database of 3 million+ web surveys conducted by Lightspeed Research/Kantar
From Coombe, R., Jarrett, C. and Johnson, A. (2010) “Usability testing of market research surveys” ESRA Lausanne
85
86. Questionnaire
Tip Test your questions by
interviewing in context
Caroline Jarrett @cjforms (CC) BY SA-4.0
86
87. Caroline Jarrett @cjforms
Your answers to this survey
are important for our work
But what’s in it for
me? And I’m really
ready for a break.
87
89. Goals
Sample
Goals and sample for the survey
We’ve had a request for help with a survey.
We’ll be having a meeting to discuss it.
• Decide on the topics you’ll want to discuss at the meeting
• Also, prepare a suggestion for the Most Crucial Question
89
90. Questions
Write questions
• Decide on the MCQ that you will ask
– Check that users can:
• Read and understand it
• Find the answer
• Decide on the answer
• Decide if you need any extra questions
to frame the MCQ
• Is there a Burning Issue?
90
92. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(The full survey-process diagram again: Goals → Sample → Questionnaire → Fieldwork → Responses → Insights, each stage producing its output, with 'Test the questions' running alongside.)
92
93. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(Diagram, highlighting Fieldwork: run the survey from invitation to follow-up → people who actually answer.)
93
96. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
Recap: Response relies on
effort, reward, and trust
People will only respond if they trust
you. After that, it's a balance between
the perceived reward from filling in the
survey compared to the perceived
effort that's required. Strangely
enough, if a reward seems 'too good to
be true' that can also reduce the
response.
Diagram from Jarrett, C, and Gaffney, G (2008) “Forms that work: Designing web forms for usability”
inspired by Dillman, D.A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
96
97. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
The elements of a good invitation
• Trust:
– Say who you are
– Say why you’ve contacted this person
specifically
• Perceived reward:
– Explain the purpose of the survey
– Explain why this person’s responses
will help that purpose
– If there is an incentive, offer it
• Perceived effort:
– Outline the topic of the survey
– Say when the survey will close
– Do NOT say how long it will take
• (unless you have tested the heck out of it and
are extremely sure that you know the answer)
97
98. Fieldwork
Write the invitation
and thank-you
• Hints:
– the invitation can be part of the questionnaire
– thank-you is on a separate page
98
99. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
Test it: pilot study
• Run the survey from invitation to the follow-up
• Look for mechanical problems like wrong link in the
invitation, no thank-you page
• Find out what your response rate is
so that you can work out your sample size
“If you don’t have time to do
a pilot study, you don’t have
time to do the survey” 99
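One reason the pilot response rate matters: it tells you how many invitations you need for a target number of responses. A minimal sketch of that arithmetic (an assumption, not a formula from the slides):

```python
import math

def invitations_needed(target_responses: int, pilot_response_rate: float) -> int:
    """Invitations to send, given the response rate observed in the pilot."""
    return math.ceil(target_responses / pilot_response_rate)

# e.g. the pilot got a 10% response rate and we want about 400 responses
print(invitations_needed(400, 0.10))  # -> 4000
```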
101. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
Think about the test and iterate
• Are the people you tested with representative?
• Did you test the whole survey
– From invitation to follow up?
– Including the analysis of responses?
– Including finding out whether you can make the decision?
• What do you need to change for the next version?
101
103. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(The full survey-process diagram again: Goals → Sample → Questionnaire → Fieldwork → Responses → Insights, each stage producing its output, with 'Test the questions' running alongside.)
103
105. Caroline Jarrett @cjforms (CC) BY SA-4.0
The answers that you get will tell you
whether you had good questions
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
Measurement error:
Mismatches between the
questions you ask and the
answers people actually give you
107. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Clean your data
• Look for gaps and missing entries
• Remove any (unintended) duplicate responses
• Read the answers to make sure that
they make sense compared to the questions
Image credit: Shutterstock
Adapted from Boslaugh, S. and P. A. Watters (2008) Statistics in a Nutshell. O’Reilly.
107
108. Responses
Have a go at data cleaning
Here are some answers from a survey
• What adjustments could we make to this question to
make the next survey better?
108
109. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Decide whose answers to include
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
Adjustment error:
Problems when deciding whether
to include or exclude someone’s
answers
110. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Look after your data
• Data analysis can take a long time;
you won’t want to repeat it
– Make copies of your data, especially before any drastic change
– ‘Undo’ doesn’t always work on large files
• Make notes of what you did
– It helps if you have to defend your conclusions
– It’s hard to remember
the details a year later
Image credit: Shutterstock
110
111. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Decide what to do when people
have skipped questions or dropped out
1. Remove the whole of that person’s response
2. Use the partial responses, and accept that your number
of responses is lower for some questions
3. Calculate an “imputed value”
– Include a flag showing that the value is calculated
– Estimate the most likely value using the other data
111
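Option 3 above can be sketched like this. Mean imputation with a flag is one common choice; the method here is an assumption for illustration, the slides don't prescribe one:

```python
def impute_missing(values):
    """Replace missing answers (None) with the mean of the answered values,
    returning (value, was_imputed) pairs so imputed entries stay flagged."""
    answered = [v for v in values if v is not None]
    mean = sum(answered) / len(answered)
    return [(v, False) if v is not None else (mean, True) for v in values]

ratings = [4, 5, None, 3, None, 4]
print(impute_missing(ratings))  # imputed entries get the mean (4.0) and a True flag
```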
112. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
If you’re losing people,
have you still got representativeness?
Image credit: Caroline Jarrett / CorelDraw
112
113. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
You can interpret data well – or poorly
Caroline Jarrett
@cjforms
(CC) BY SA-4.0
Processing error:
Bad choices about how to interpret
the answers
114. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Typing in the answers = coding
Image credit: https://www.census.gov/history/www/census_then_now/notable_alumni/herman_hollerith.html
114
115. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
If you ask for answers,
you have to read and think about them
115
116. Responses
Have a go at coding
Here are some answers from a survey
• Are there any themes?
• How would you code them?
116
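A hypothetical sketch of keyword-based coding; the code frame and keywords are invented for illustration, and real coding needs human judgement on top:

```python
# Hypothetical code frame: theme -> keywords that suggest it
CODES = {
    "speed": ["slow", "fast", "wait"],
    "navigation": ["find", "menu", "lost"],
}

def code_answer(answer: str) -> list:
    """Return the themes whose keywords appear in a free-text answer."""
    text = answer.lower()
    return [code for code, words in CODES.items() if any(w in text for w in words)]

print(code_answer("The site was slow and I couldn't find the menu"))
# -> ['speed', 'navigation']
```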
117. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
CAQDAS tools are available
(but are a big challenge)
Before buying one, read this site:
http://www.surrey.ac.uk/sociology/research/researchcentres/caqdas/support/choosing/index.htm
http://bit.ly/Surrey1234
Image credit: http://www.surrey.ac.uk/sociology/research/researchcentres/caqdas/support/choosing/index.htm
117
118. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Wordle from a survey
on usability certification
118
121. Responses
Here’s an example of results
• Does the pie chart / bar chart presentation work for you?
• Does the tag cloud help or not?
121
122. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(The full survey-process diagram again: Goals → Sample → Questionnaire → Fieldwork → Responses → Insights, each stage producing its output, with 'Test the questions' running alongside.)
122
123. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(Diagram, highlighting Insights: analyse and present the results → decisions.)
123
126. Caroline Jarrett @cjforms (CC) BY SA-4.0
Standard deviations (SD) show spread
By JRBrown - Own work, Public Domain,
https://commons.wikimedia.org/w/index.php?curid=10777712
126
127. Caroline Jarrett @cjforms (CC) BY SA-4.0
Correlations show relationships
+1.0 closely aligned
+0.8 somewhat aligned
+0.4 a bit aligned
+0.0 no relationship
-0.4 one goes up, the other one goes down a bit
-0.8 one goes up, the other one mostly goes down
-1.0 one goes up, the other goes down the same amount
https://commons.wikimedia.org/wiki/File:Correlation_examples2.svg
128. Caroline Jarrett @cjforms (CC) BY SA-4.0
Correlations do not show slope
https://commons.wikimedia.org/wiki/File:Correlation_examples2.svg
128
129. Caroline Jarrett @cjforms (CC) BY SA-4.0
Some relationships are not correlated
https://commons.wikimedia.org/wiki/File:Correlation_examples2.svg
129
131. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Actual values for Justin’s dataset
• X Mean: 54.26
• Y Mean: 47.83
• X SD: 16.76
• Y SD: 26.93
• Corr.: -0.06
https://twitter.com/JustinMatejka/status/770682771656368128
131
132. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Use graphs and charts to
understand relationships in the data
Anscombe, F. J.. (1973). Graphs in Statistical Analysis. The American Statistician, 27(1), 17–21. http://doi.org/10.2307/2682899
132
135. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Use descriptive statistics
to explore numerical data
• Most seen for statistics
– Mean (arithmetic average)
– Standard deviation (spread of answers)
– Correlation (whether answers vary together)
• Useful for thinking about the data
– Range (lowest to highest)
– Mode (most common answer)
135
136. Insights
Dealing with rating questions
A ‘Like / Dislike’ question got these responses
Strongly dislike 2
Dislike 6
Neither dislike nor like 14
Like 31
Strongly like 13
Total responses 66
Please work out:
the percentage of respondents who ‘like’
136
137. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
There are many ways to combine
ratings into means and percentages
• 47% 31 ticked ‘like’ so 31/66 = 47%
• 67% ‘Top box’ / ‘top 2 box’ uses the positive responses
• 68% ‘0 to 4’ weights responses: 0%, 25%, 50%, 75%, 100%
• 74% ‘1 to 5’ weights responses: 1, 2, 3, 4, 5 (then divide by 5)
• 36% ‘-1 to 1’ weights responses: -100%, -50%, 0, 50%, 100%
67% 68% 74% 36%
137
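The five numbers on this slide can be reproduced directly; a sketch of the arithmetic:

```python
# Responses to the Like/Dislike question, in scale order:
# Strongly dislike, Dislike, Neither, Like, Strongly like
counts = [2, 6, 14, 31, 13]
n = sum(counts)  # 66

like_only = counts[3] / n                                            # ticked 'like'
top_2_box = (counts[3] + counts[4]) / n                              # positive responses
zero_to_4 = sum(w * c for w, c in zip([0, .25, .5, .75, 1], counts)) / n
one_to_5 = sum(w * c for w, c in zip([1, 2, 3, 4, 5], counts)) / n / 5
minus1_to_1 = sum(w * c for w, c in zip([-1, -.5, 0, .5, 1], counts)) / n

for name, v in [("like only", like_only), ("top 2 box", top_2_box),
                ("0 to 4", zero_to_4), ("1 to 5", one_to_5),
                ("-1 to 1", minus1_to_1)]:
    print(f"{name}: {v:.0%}")  # -> 47%, 67%, 68%, 74%, 36%
```

Five defensible methods, five different numbers from the same answers, which is exactly why the method has to be chosen up front.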
138. This example has a graph
• This example uses the calculation:
Poor = 1
Reasonable = 2
Good = 3
Excellent = 4
• Is the graph an appropriate illustration of the data?
138
139. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Net Promoter Score™
has a special analysis method
Image credit: https://www.netpromoter.com/know/
139
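The published NPS method groups 0-10 ratings into promoters (9-10), passives (7-8) and detractors (0-6), and reports promoters minus detractors as a percentage; a minimal sketch:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count only in the total."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 8, 7, 6, 3, 10]))  # 3 promoters, 2 detractors, 7 answers -> 14
```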
140. Caroline Jarrett @cjforms
Asking
the right
question
Choose whichever
method you like,
but you must
make the choice
when you decide on
the goals of the survey
140
141. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
(The full survey-process diagram again: Goals → Sample → Questionnaire → Fieldwork → Responses → Insights, each stage producing its output, with 'Test the questions' running alongside.)
141
142. Caroline Jarrett @cjforms (CC) BY SA-4.0
All the topics are connected
(Octopus diagram: Goals, Questions, Sample, Questionnaire, Fieldwork, Responses and Insights, all connected.)
142
143. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim is to get the best number you
can, within the resources you have
Question side: what you want to ask about → the reason you’re doing it → the questions you ask → the answers you get → the answers you use.
People side: who you want to ask → the list that you sample from → the sample you ask → the ones who answer → the ones whose answers you can use.
Both sides meet in: the number.
143
145. Caroline Jarrett @cjforms
Total Survey Error diagram as presented in
Groves, R. M., F. J. Fowler, M. P. Couper, J. M.
Lepkowski, E. Singer and R. Tourangeau (2009).
Survey methodology. Hoboken, N.J., Wiley.
145
147. Caroline Jarrett @cjforms (CC) BY SA-4.0
Should I do this survey?
Is a survey the right way to get the answers? Yes →
Do I have time to test and to iterate? Yes →
Do people have answers to these questions? Yes →
Do people want to respond to my request? Yes →
Do I know how I’m going to use the answers? Yes → Go
147
The survey sits between 'what you want to ask', 'who you want to ask' and 'the number'
Screenshot of the Suttons Seeds website with a pop-up box: "Help us improve. We value your opinion. What do you like about our site and what can we improve on?"
A process starting with one person face to face, continues through 10 people by phone, gets to 100 people by email or pop-up.
It’s best to check that your question works with one person before you hassle 10 people with it. Then check it works with 10 people before you send it to 100. Once you’ve tried it on 100 people, you might be more interested in a new question than getting more answers on this question
Another useful iteration: start with whatever questions your stakeholders have, then narrow down to useful questions, then narrow again to your MCQ
The octopus, with focus on 'The list you sample from'
Prank leaves Justin Bieber facing tour of North Korea
By Daniel Emery Technology reporter, BBC News
5 July 2010
Image caption: It is highly unlikely Bieber would be given permission to enter North Korea. Canadian singer Justin Bieber has become the target of a viral campaign to send him to North Korea.
A website polled users as to which country he should tour next, with no restrictions on the nations that could be voted on.
There are now almost half a million votes to send the singer to the secretive communist nation.
The contest, which ends at 0600 on 7 July, saw North Korea move from 24th to 1st place in less than two days.
Many of the votes are thought to originate from imageboard website 4chan, which has built a reputation for triggering online viral campaigns.
A classic example of the difference between response and representativeness: a Justin Bieber fan site organised a poll to see where the teen star should have his next concert. The poll got a big response but the winning location was North Korea. It seems unlikely that the respondents were representative of true Bieber fans.
The picture reflects the mistakes we can make if we do our sampling based solely on the judgement of an interviewer
This is a genuine invitation from local government, but the layout and images in the invitation make it look as if it's an approach from some sort of spammer or scammer.
The octopus again. This time we're looking at 'the questions we ask'.
The four steps are:
Read and understand the question
Find the answer
Judge whether the answer you’ve found aligns with what you say and the way you want to portray yourself
Place the answer on the form (or give it to the interviewer)
Two questions from a survey:
'24: Do you use a Windows or Mac computer'?
'25. What is your gender'?
Photo of a Samsung (android) mobile with the same questions as previous slide. If you only have an Android mobile, how do you answer ‘do you use a Windows or Mac computer’ when the answer options are ‘Windows’, ‘Mac’ and ‘Both’?
A model wears a t-shirt with Gender: 'Male' (crossed out), 'Female' (crossed out) and 'Other' (added and ticked).
Question from Likert’s 1932 paper.
13. How much military training should we have?
We need universal compulsory military training (1)
We need Citizens Military Training Camps and Reserve Officers Training Corps, but not universal military training (2)
We need some facilities for training reserve officers but not as much as at present (3)
We need only such military training as is required to maintain our regular army (4)
All military training should be abolished (5)
Another question, this time using what is often called a ‘Likert Scale’ but should really be called a ‘Likert response format’
17. The United States, whether a member or not, should co-operate fully in the humanitarian and economic programs of the League of Nations.
Strongly approve (5)
Approve (4)
Undecided (3)
Disapprove (2)
Strongly disapprove (1)
Krosnick and Presser refer to ~87 papers on response points.
This selection of questions from different surveys has:
One with seven response points in the range
One with two response points (yes/no)
One with five response points plus ‘not applicable’
One with three response points
One with four response points plus a comment box
One with four response points on their own
One with 10 response points plus ‘Don’t Know’
Ratings screens examples
A bar chart, answers to question "Are you a parent or guardian of a child in any of the following age bands (please tick all that apply)?
There are five bands of two years each starting with 'Aged 2 or under' and going evenly up to 'Aged 13-15'. These all have roughly similar numbers of respondents - around 15%.
There's one massive bar for the category 'Aged 16+', with around 45% response.
The octopus again; we've looked at 6 of the 8 tentacles.
This is a more conventional way of looking at the octopus tentacles
Is this the only way I can get this data?
If yes:
Do users want to talk to us about these topics?
If yes:
Do users have answers to these questions?
If yes:
Do I have time to do a pilot?
If yes:
Do I know how I’m going to use the answers?
GO