As we move into a new AI-enabled world, the role and decisions of User Experience designers become increasingly important. User Experience Design is at a crossroads with emerging technology, IT business models, and societal trends and needs. Human-centered Design and user advocacy are being challenged by algorithm-driven and manipulative interfaces (think social media), as well as by business and technology models that clash with user control and civil rights. New strategies are required for us to reduce bad, unethical, and embarrassing user experiences.
In this webinar, we explore what causes ethical issues in UX, how to turn this around, and how to start practicing 'UX with Ethics' in our day-to-day work. The webinar aims to provide practical understanding, tips, and techniques to help you educate your team (a copy of the PPT will be made available) and to take a more influential role in your organization's design decision-making.
Recording of webinar given Jan 2021: https://youtu.be/EJcF75fOBYU
Ethics for UX Crash Course
Slide 1
Ethics for UX Design
crash course
With Frank Spillers
CXO
Experience Dynamics
OPEN SOURCE: FEEL FREE TO SHARE AND PRESENT TO YOUR TEAM
Slide 2
Ethics for UX Design crash course
AGENDA:
1. UX & Ethics: what to tell your team
2. How UX Designers can influence Ethics
3. New Habits: What teams need to do for better UX Ethics
4. Q&A
Slide 15
In 2018, Twitter CEO Jack Dorsey tweeted:
"We have witnessed abuse, harassment, troll armies, manipulation through bots and human-coordination, misinformation campaigns, and increasingly divisive echo chambers. We aren't proud of how people have taken advantage of our service, or our inability to address it fast enough."
Slide 17
Usually motivated by:
An aggressive growth strategy
Sales & marketing strategy
An attempt to grow market share
A Product Manager doing their job
Maxing out "Engagement"
Slide 18
Aspiration vs Reality
Justin Rosenstein, who led the team that built the Facebook Like button, says the team was motivated by a desire to "spread love and positivity in the world."
People within the technology industry have leaned into such techno-utopianism for decades, underplaying how maximizing profits is a major motivating factor for them.
https://www.metamute.org/editorial/articles/californian-ideology
Slide 19
Keep Scrolling…
How it's displayed
How it behaves (social media hacks behavior, inducing accidental extended time on site/app)
Left to right
Slide 20
Quiz 1
How would you approach a Notifications experience knowing that Notifications are:
a) addictive;
b) 'noise';
c) overwhelming?
Slide 21
"Technology in general, and AI in particular, is not value-neutral. The design decisions that are taken while developing AI impart certain values to AI whether we want them to or not. The key is to bridge the gap between high-level principles and the design of AI systems."
— Steven Umbrello, Managing Director of the Institute for Ethics and Emerging Technologies
Slide 24
How bad design happens
• "I am right, you (users) are wrong"
• Familiarity: I know this to be true, so it must be
• Comforts: mine are everyone's
• Rules: I know them, and if you haven't learned them, too bad
• "We know best, it's our job to know"
Slide 25
So, how are we deciding?
What's important to our users?
What's MVP?
Whether we're creating problems downstream?
Slide 27
Garbage in, garbage out (applies to algorithms too)
Examples:
• Racist (MSFT Twitter bot made hate speech)
• Sexist (Apple credit card favored men; Amazon hiring bot favored men)
• Transphobic (forms question: 'male or female?')
Slide 28
Stress test
Does your algorithm build on and reinforce only popular preferences?
Is your AI system able to evolve dynamically as your customers change over time?
Is your AI system helping your customers have a more diverse and inclusive view of the world?
Confirmation bias: 'proof' confirms your beliefs.
AI tips from Microsoft's "In Pursuit of Inclusive AI"
Slide 30
For my Ethics in UX webinar…
This start-up took webinar registration, "promoted it" by passing registration through his site (capturing user details), then kicked the user out to my Eventzilla page, where the webinar was sold out.
Growth hacking, anyone???
Slide 32
Who owns this problem?
Product Managers
UX Designers
Devs
Business Teams
Architecture decisions
Research Teams
Executives
Boards
Business models
Politicians
Regulators
Lawyers
33. 33
1) Inevitability (it’s coming so)
2) Neutrality (tech will find the solution so)
3) Utopia- technolust (it’s already here so)
4)Trust in algorithms (tech will find the solution;Tech is unbiased; this
computer said— the data said so…)
5) People will adapt (build and they will adopt it so)
Avoid the 5 Naiveties
Slide 38
'Agency by Design' (proposed by Christopher Wylie)
Like Privacy by Design…
Platforms adopt choice-enhancing design
Ban dark patterns
Benefit to user = to feature
Prohibit 'undue influence' (e.g. long-term addiction)
Avoid-harm principle
Abusability audits
Slide 39
Rules contain:
Decisions
Choices
Logic (system requirements)
Intent (business requirements)
Most rules victimize the user through a lack of transparency (at the UI level). Rule-interpreting is the enemy of good UX.
Slide 40
Control
How well supported are user goals and the ability to define or change goals?
Do users really have control? (e.g. does Cancel really cancel?)
How easy is it to manage data transparency? (e.g. Facebook's privacy settings took 100 iterations to get right)
Slide 41
Power
Do we take power away from a user or community with this product or service feature or behavior?
Are we giving power (and is it wanted)?
Are we reducing power, and how?
Redefine ENGAGEMENT
Slide 42
Changes required: human-centered design ethics
BELIEFS: Justice, Inclusion, Wellness
APPROACHES: Human-centered / community-centered design
PRACTICES: Disclose how it was made and how it was assessed/tested
EVALUATE: Add rigor, scrutiny, integrity
Slide 43
Ethical UX Design
o Were users included? (field study, user testing)
o Who are the stakeholders?
o Was consent given?
o Is this morally right or wrong?
o Do business interests prevent us from committing to ethical UX?
o What are the benefits?
o What are the downsides?
o What are the legal, PR, and political repercussions?
o What are the cultural upsides/downsides (rules, behaviors, trends)?
o What does this look like in 10 years' time?
Users = stakeholders, or affected communities, persons (identities), or the environment…
Slide 46
Who are we impacting?
User
Community
Historical, Cultural, Political
Environmental
Slide 47
QUIZ: Which country gave all rivers the legal rights of a person?
Thailand
Bangladesh
New Zealand
Canada
Kazi Salahuddin Razu/NurPhoto via Getty Images
Slide 48
And at what cost?
Individual or community Rights
Harm identity or wellbeing
Harm environment
Legal rights
PR, Brand
Slide 49
Red Team vs Blue Team
o (Ethics audit)
o 1 day
o Rip apart the other team's design
"Ethical UX group Heuristic Evaluation"
Slide 50
Exploring Harm: "What could possibly go wrong?"
Bring user concerns raised in user interviews and observations (ethnography)…
• Look at legal and past precedents…
• Look at activist opinions (e.g. privacy).
• Look at other countries or regions (e.g. the EU or Finland).
• Historical factors
• Cultural factors
Slide 51
Ethical areas to check
Use a PESTLE Analysis (also used in localization projects).
PESTLE is a mnemonic which in its expanded form denotes P for Political, E for Economic, S for Social, T for Technological, L for Legal, and E for Environmental.
• What is the political situation of the country, and how can it affect the industry?
• What are the prevalent economic factors?
• How much importance does culture have in the market, and what are its determinants?
• What technological innovations are likely to pop up and affect the market structure?
• Is there any current legislation that regulates the industry, or could the legislation for the industry change?
• What are the environmental concerns for the industry?
See https://pestleanalysis.com/what-is-pestle-analysis/
Slide 52
"There are hundreds of examples of people finding ways to use technology to harm themselves or other people, and the response from so many tech CEOs has been, 'We didn't expect our technology to be used this way.' We need to try to think about the ways things can go wrong. Not just in ways that harm us as a company, but in ways that harm those using our platforms, and other groups, and society… we need Abusability Testing."
— Ashkan Soltani, former Chief Technologist of the FTC (2019, in Wired)
Slide 54
How to Test? First, ask:
Ask extra-critical folks in your org
Ask users
Ask SMEs
Ask lawyers
Ask intersectional UX experts
Slide 55
Intersectional means your UX person cares about users, plus:
Racial justice
Gender equality
LGBTQI+ rights
Environmental justice
Accessibility
Sociability
USERS' RIGHTS+
Slide 56
How to Test? Next, scan:
Scan for ethical violations
Map issues against ethical goals
Evaluate algorithms (stress test)
Slide 57
o Societal level (legal, cultural)
o Business level (team, board)
o Product- Service level ($)
o UI level (motivation)
o Algorithm level (AI)
Slide 58
"Finally—and this is perhaps the most difficult part—we work with ourselves. If we think maybe we understand how other people tick, we're damn sure we know how we tick. It's hard to come to grips with the fact that we don't actually understand how we make our own decisions. Our own biases are the thorniest to spot and combat, and the most likely to cause trouble for the people we design with or for."
(Thomas, 2020)
Slide 59
LEARN from the Giants' mistakes (and from Silicon Valley and start-up culture)
Slide 62
That's right: I want you to give this presentation to YOUR TEAM.
Just credit Frank Spillers, CEO @ Experience Dynamics.
Slide 63
Author: Frank Spillers
Frank Spillers is the founder of Experience Dynamics, a leading UX consulting firm with Fortune 500 clients around the world. For over 20 years, Frank has been an internationally respected speaker, author, senior UX practitioner, and UX Master Trainer. He is a world expert in improving the design and user experience of websites, web applications, VR/AR, and mobile apps and services.
He teaches UX at the Interaction Design Foundation and at UXInnerCircle.com.
Chief Experience Officer @ Experience Dynamics
OPEN SOURCE: FEEL FREE TO SHARE AND PRESENT TO YOUR TEAM
Any and all images are copyright their respective owners.
Slide 64
Recording of original webinar (Jan 2021) available here: https://youtu.be/EJcF75fOBYU
PowerPoint file available here: http://alturl.com/2gu4w
If you need to cite this for a class:
Spillers, Frank (2021). "Ethics for UX Crash Course". Accessed on {date}. http://alturl.com/2gu4w