Carine Lallemand - How Relevant is an Expert Evaluation of UX based on a Psychological Needs-Driven Approach? Paper presented at the 8th Nordic Conference on Human-Computer Interaction NORDICHI’14.
1. How Relevant is an Expert Evaluation of UX based
on a Psychological Needs-Driven Approach?
Carine Lallemand - CRP Henri Tudor / University of Luxembourg
Vincent Koenig - University of Luxembourg
Guillaume Gronier - Public Research Centre Henri Tudor, Luxembourg
8th Nordic Conference on Human-Computer Interaction NORDICHI’14
2. Expert evaluation
methods in HCI
Inspection methods: inspection of
the interface by an evaluator
Relies solely on the expertise and
judgment of the evaluator
Developed in the 1990s as discount
usability engineering methods
Cheap, fast, easy to use
Most common: heuristic evaluation
Some limitations: low validity and
limited reliability
3. Beyond usability:
UX expert evaluation
Very few methods described as purely
expert-based (Vermeeren et al., 2010)
Some heuristic sets focused on UX:
• UX heuristics for Web 2.0 services
(Väänänen-Vainio-Mattila & Wäljas, 2009)
• Ten Heuristics for Optimal UX
(Colombo & Pasch, 2010)
• Ten UX Heuristics
(Arhippainen, 2013)
Source: http://allaboutux.org
4. How relevant is an expert evaluation of UX?
Many reasons why expert-based evaluations are not commonly explored
in UX research
Nevertheless, UX practitioners frequently use purely expert-based
evaluation to assess UX
Research questions:
(1) Are experts actually able to conduct a UX expert evaluation?
(2) How do they proceed? On which elements is their assessment based?
(3) How good is their evaluation? Does it reflect the real experience of
end-users?
5. Human needs as triggers for positive UX
"Good UX is the consequence of fulfilling the human needs for autonomy,
competency, stimulation, relatedness, and popularity through interacting
with the product or service (…)" (Hassenzahl, 2008)
Each UX Card presents one need through a title, a definition, real-life
examples, pictures, and keywords.
Seven needs, represented on 7 UX Cards:
Relatedness / Belongingness
Competence / Effectiveness
Autonomy / Independence
Security / Control
Pleasure / Stimulation
Influence / Popularity
Self-Actualizing / Meaning
Sheldon et al., 2001; Hassenzahl et al., 2010
7. Methodology - UX expert evaluation
33 UX experts assessed 4 systems: Amazon, Facebook, Angry Birds, Olympus digital camera
Procedure: familiarization with the cards → identification of elements
using the cards → overall assessment of UX needs fulfillment
Using the UX Cards, experts were asked
to identify elements impacting - positively
or negatively - one of the 7 needs
For each system, we then asked the
experts to provide an overall UX
assessment of the system
9. Methodology - User Testing
70 users in a usability lab assessed 2 interactive systems: Amazon and an Olympus digital camera
Procedure: free exploration and scenarios → questionnaires (AttrakDiff
and UX needs) → interview
10. Expert evaluation vs. User Testing
                 Expert evaluation                          User Testing
Systems          4 assessed systems                         2 assessed systems
Procedure        Free exploration and assessment,           Free exploration and scenarios
                 15 minutes per system
Participants     33 UX experts                              70 users
Instruments      UX Cards; global evaluation of UX needs    AttrakDiff scale; UX needs questionnaire
Data collection  Observation and interview                  Observation and interview
11. Expert Evaluation: participants
33 UX experts
(16 women and 17 men)
Mean age = 31 years
(min 23, max 43, SD = 5.96)
Level of expertise with expert
evaluation
(self-assessed on a 7-point scale)
M= 5.24 (SD=1.39)
12. Research Question 1:
Are experts actually able to conduct a UX expert evaluation?
Total of 1794 identified elements…
On average, 54 elements per expert and 14 per system
No differences according to the assessed system
…linked to a total of 3455 needs
2277 needs cited as positive and 1179 as negative
Experts encountered no blocking issues in
conducting the UX expert evaluation
[Pie chart: total number of cited needs, positive vs. negative:
66% positive, 34% negative]
A major difference from usability evaluation, where evaluators
typically report around 25% positive elements and 75% issues
14. Research Question 2:
How do they proceed? On which elements is their assessment based?
Most cited needs: pleasure, security
Least cited needs: influence, self-actualizing
Main types of identified elements: concept/content, features,
design, and usability
[Pie chart: total number of cited needs per need: Pleasure 23%,
Security 22%, Competence 17%, Autonomy 13%, Relatedness 11%,
Influence 8%, Self-Actualizing 6%]
[Bar chart: number of identified elements per type (scale 0-700):
Marketing/Brand, Concept/Content, Design, Usability, Features,
Interoperability, Service experience, Adverts]
15. Research Question 2:
How do they proceed? On which elements is their assessment based?
Effect of familiarity with the system on UX assessment:
The more familiar an expert is with a system, the more likely they are
to assess the system positively
(significant correlations from .37 for Angry Birds to .45 for Amazon and .46 for Facebook)
Link between identified elements and overall assessment (Likert scale):
In all cases, the overall UX assessment correlates positively with the
number of positive needs cited and negatively with the number of negative
needs cited, suggesting that the overall assessment is grounded in the
evaluation task.
However, this does not always hold at the level of individual needs:
the overall assessment of a need may sometimes rest on factors other
than the number of elements and needs identified
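The kind of correlation analysis described here can be sketched in a few lines. The sketch below uses illustrative data, not the study's: per-expert familiarity ratings (7-point scale) paired with overall UX assessments (5-point Likert scale), with a hand-rolled Pearson coefficient so no external library is assumed.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (not the study's) data: one expert per row, familiarity
# with a system (1-7) and overall UX assessment of that system (1-5).
familiarity = [1, 2, 2, 3, 4, 5, 5, 6, 6, 7]
assessment  = [2, 3, 2, 3, 3, 4, 3, 4, 5, 5]

r = pearson_r(familiarity, assessment)
print(round(r, 2))  # a positive r, in the spirit of the reported .37-.46
```

The same function applies to the second analysis: correlate each expert's overall Likert rating with their counts of positive and negative needs cited.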
16. Research Question 3:
How good is their expert evaluation?
Experts - Top-3 UX Needs
(number of citations)
Security
Pleasure
Competence
User Tests - Top-3 UX Needs
(5-point scale)
Security (3.44)
Autonomy (3.34)
Competence (3.2)
[Pie chart: camera, needs cited as positive by experts: Security 26%,
Pleasure 22%, Competence 21%, Autonomy 11%, Relatedness 9%,
Influence 6%, Self-Actualizing 5%]
17. Research Question 3:
How good is their expert evaluation?
Experts - Top-3 UX Needs
(number of citations)
Competence
Pleasure
Security
User Tests - Top-3 UX Needs
(5-point scale)
Security (3.93)
Competence (3.62)
Autonomy (3.62)
[Pie chart: Amazon, needs cited as positive by experts: Pleasure 21%,
Competence 21%, Security 20%, Autonomy 15%, Relatedness 11%,
Influence 8%, Self-Actualizing 4%]
18. Research Question 3:
How good is their expert evaluation?
Comparison experts vs. users
[Radar charts reconstructed as tables: overall assessment of UX needs
fulfillment (5-point scale), experts vs. users]

Digital camera:
Need              Experts  Users
Relatedness       3.78     2.67
Security          3.11     3.44
Pleasure          3.66     2.71
Influence         3.16     2.04
Competence        3.38     3.20
Autonomy          3.51     3.34
Self-Actualizing  3.33     2.51

Amazon:
Need              Experts  Users
Relatedness       3.42     2.12
Security          3.48     3.93
Pleasure          3.78     2.81
Influence         3.38     2.38
Competence        3.98     3.62
Autonomy          3.53     3.62
Self-Actualizing  2.81     2.62
Digital Camera
Experts overestimated
the fulfillment of all needs,
except for Security
Amazon
Experts overestimated the
fulfillment of all needs,
except for Security and
Autonomy
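The expert-vs-user comparison behind these conclusions reduces to checking, per need, whether the experts' mean rating exceeds the users'. A minimal sketch, using the Amazon figures shown on the slide:

```python
# Mean ratings of UX needs fulfillment (5-point scale) for Amazon,
# values taken from the slide's radar charts.
experts = {"Relatedness": 3.42, "Security": 3.48, "Pleasure": 3.78,
           "Influence": 3.38, "Competence": 3.98, "Autonomy": 3.53,
           "Self-Actualizing": 2.81}
users   = {"Relatedness": 2.12, "Security": 3.93, "Pleasure": 2.81,
           "Influence": 2.38, "Competence": 3.62, "Autonomy": 3.62,
           "Self-Actualizing": 2.62}

# Needs where experts rated fulfillment higher than users did
overestimated = sorted(need for need in experts if experts[need] > users[need])
print(overestimated)

# Needs where experts did not overestimate
not_over = sorted(need for need in experts if experts[need] <= users[need])
print(not_over)  # ['Autonomy', 'Security'], matching the slide's conclusion
```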
19. Summary of main results
• UX experts encountered no blocking issues in conducting a UX
expert evaluation
• Experts tend to link elements to positive needs rather than to
negative ones
• There is an impact of familiarity with a system on the UX expert
evaluation: an objectivity issue?
• Using the UX cards, experts managed to identify the prominent
needs fulfilled by the system
• Using the Likert scale, experts overestimated the fulfillment of all
needs, except Security and Autonomy
20. Limitations and future work
It is difficult to evaluate UX and to compare the results of UX evaluations
Are some experts better than others at evaluating UX? Some profile
factors impacted the results
Improvements to the process to ensure a better quality of UX
evaluation:
• UX expert evaluation should be combined with scenarios or personas, to make it
easier for the expert to adopt the user's perspective
• a guided process to help evaluators think about all kinds of elements impacting
UX (including the absence of an element)
Purely expert-based UX evaluation should not replace other methods
involving users!
21. Thank you for your attention!
Comments?
Questions?
Suggestions?