These are the slides from Amit Svarzenberg's workshop at the Product of Things conference in Tel Aviv in July 2018:
Who this workshop is for:
This workshop is for those who want to learn the process of prototyping and testing a virtual experience correctly, all without the need to use code!
After this workshop you will be able to:
- Understand how the human mind perceives reality and in what way you can manipulate it to create an immersive experience
- Create a prototype for a virtual reality application and a 360° experience for use on a regular mobile phone or a headset
- Use Unity3D to create experiences adapted to each of the most advanced devices in the world (Oculus Rift, HTC Vive, Google Cardboard, etc.)
What is covered:
- How we perceive reality: learn and experience Sensor Fusion, which makes the user feel immersed in a new reality
- The UX questions the leading companies in the field ask: learn which UX principles matter and how to apply them in Unity3D
- The latest developments in the field: be familiar with the latest tools, infrastructures, and technologies
- The process of prototyping and testing: create a prototype for a virtual reality application and a 360° experience for use on a regular mobile phone or a headset
In this workshop, Amit will share his knowledge in the field of VR from his unique perspective as a product manager who transitioned from software development.
5. “Virtual reality is an artificial environment that is created with
software and presented to the user in such a way that the user
suspends belief and accepts it as a real environment.”
WHAT IS VR
10. MOBILE VR HEADSETS
• The little magnet on the side is actually a quite ingenious design aspect of Google Cardboard. It's a button!
• It uses your phone's magnetometer, normally used for the compass, to sense the magnet's movement and register a click while the phone is inside the cardboard.
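The magnet trick above can be illustrated with a minimal sketch: pulling the magnet produces a sharp spike in the measured field strength, so a simple threshold on the change between consecutive magnetometer readings registers a "click". This is a hypothetical illustration of the idea, not Google's actual implementation; the function name, sample values, and threshold are our own.

```python
import math

def detect_clicks(readings, threshold=80.0):
    """readings: list of (x, y, z) magnetometer samples in microtesla.
    Returns the indices where the field magnitude jumps by more than
    `threshold` between consecutive samples."""
    clicks = []
    prev = None
    for i, (x, y, z) in enumerate(readings):
        mag = math.sqrt(x * x + y * y + z * z)
        if prev is not None and abs(mag - prev) > threshold:
            clicks.append(i)
        prev = mag
    return clicks

# Steady field, one magnet pull, then steady again.
samples = [(30, 0, 40)] * 5 + [(30, 0, 200)] + [(30, 0, 40)] * 4
print(detect_clicks(samples))  # both the pull and the release spike
```

A real detector would also debounce, merging the pull/release pair into a single button press.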
35. CONTROL
Users are not “passengers” in the app. They should remain in control of their movement inside the app, so that they can actively choose what they are about to see.
38. ANCHOR OBJECT
We are used to standing or
sitting still while the world is
moving around us, for example,
when driving. We don’t feel
nauseous because we have a
visual anchor: the car dashboard.
39. ALWAYS MAINTAIN HEAD TRACKING
Never stop tracking the user’s head position inside the application. Even a short pause in head tracking will cause some users to feel ill.
41. RETICLE
A reticle helps us understand what in the space is actionable.
It can be a button, a dot, a circle, a hand…
SELECTION 1.5 - 2.0 SECS
http://share.framerjs.com/ojd9q3dg5xem/
42. RETICLE
• Avoid placing fuse buttons in close proximity
to each other. Fuse buttons work best if they
are large targets that are sufficiently far apart
from each other.
• Display the reticle only when the user
approaches a target that they can activate.
• Project a light source, or design obvious
hover states, for objects that the user can
target.
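The 1.5-2.0 second fuse selection above can be prototyped as a simple dwell timer: accumulate gaze time each frame, fire when the fuse fills, and reset when the user looks away. This is a sketch of the logic only, not Unity code; the class and the simulated frame loop are our own assumptions.

```python
FUSE_TIME = 1.5  # seconds the reticle must dwell on a target (slide: 1.5-2.0 s)

class FuseButton:
    def __init__(self, fuse_time=FUSE_TIME):
        self.fuse_time = fuse_time
        self.progress = 0.0      # seconds of continuous gaze accumulated
        self.activated = False

    def update(self, gazed_at, dt):
        """Call once per frame. Returns True on the frame the fuse fires."""
        if self.activated:
            return False
        if gazed_at:
            self.progress += dt
            if self.progress >= self.fuse_time:
                self.activated = True
                return True
        else:
            self.progress = 0.0  # looking away resets the fuse
        return False

# Simulate 4 seconds at 60 fps: the user starts gazing at 0.5 s.
button = FuseButton()
fired_at = None
t = 0.0
for frame in range(240):
    dt = 1 / 60
    gazed = frame >= 30  # reticle lands on the target at 0.5 s
    if button.update(gazed, dt):
        fired_at = t
    t += dt
print(fired_at)  # fires roughly 1.5 s after the gaze starts
```

Driving the fuse progress into a circular fill animation on the reticle gives the user the hover feedback the previous slide recommends.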
44. KEEPING THE USER GROUNDED
How can I tell who’s moving, me or the
objects around me?
• Objects we know and expect to be grounded
• Flags
• Using shadows
• Textured floor
45. MAKE IT BEAUTIFUL
• Design and modelling are important for keeping the experience immersive.
• 3D requires heavy production work, so it costs more than 2D.
47. MENUS IN VR
• Hidden menu
• Menu as part of the virtual world
• Click Menu
• Swipe Menu
48. HIDDEN MENU
• At the top of the screen or at the bottom
• Fades in/out as the user moves their head to look at the screen
• To make the user aware of the menu, it is possible to show a hint or a popup.
• When needed, it is possible to remind the user to use the menu with a sound (the user learns to associate this sound with a certain action) or a popup (“look down to use the menu”)
49. A MENU AS PART OF THE VIRTUAL WORLD
Looks natural inside the virtual world, but can sometimes look forced.
Example: I Expect You to Die (3:20)
50. CLICK MENU
• The virtual controller doesn’t behave like the real-life controller, which can cause confusion
• Not a natural or fun experience
51. SWIPE MENU
• The user doesn’t have to focus on a single button
• The controller can be used for several menus
• Intuitive once the user learns to use the controller
• The menu has no hierarchy
• Only 4 options at a time (left/right/up/down)
55. CURVED DESIGN
Take a flat 2D layout and put it into a VR environment, and it just doesn’t work, especially if it’s too wide or tall. The edges of a flat surface will be further away from the user’s eye focus, making them blurry and hard to read. Curving the surface keeps every point at a similar distance from the eye.
65. UI DEPTH (EYE STRAIN)
Placement of the UI within the world also
needs some consideration. Too close to
the user can cause eye strain, and too
far away can feel like focusing on the
horizon - this might work in an outdoor
environment, but not in a small room.
66. UI DEPTH (EYE STRAIN)
It is recommended to place text on a radius of 2-3 meters from the user.
• Less than 2 meters: too close!
• 4-5 meters: far but possible
• 6 meters and above: too far!
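The distance bands above can be captured in a small helper for prototyping placement rules. The band limits are read directly off the slide; the gap between 3 and 4 meters is interpreted here as "far but possible", and the function name is our own.

```python
def ui_distance_rating(meters):
    """Rate a UI element's distance from the user, per the slide's bands."""
    if meters < 2:
        return "too close"       # causes eye strain
    if meters <= 3:
        return "recommended"     # the 2-3 m sweet spot
    if meters < 6:
        return "far but possible"
    return "too far"             # feels like focusing on the horizon

for d in (1.5, 2.5, 4.5, 7.0):
    print(d, "->", ui_distance_rating(d))
```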
71. FOMO
Using FOMO (fear of missing out) to guide the user, for example using VR characters’ gaze and facial expressions to intrigue the user.
72. LIGHTING
Effective lighting can draw a user into a scene or game; poor lighting is all too often the first thing that causes the entire experience to fall apart.
Lighting effects: highlighting a specific area to get the user’s attention (street light, firefly…)
74. DIFFERENT SIZES AND HEIGHTS IN VR
Using different sizes and heights in the virtual world makes the user curious, understanding that the world continues and that it is worth their while to look around.
75. SOUND
Spatial 3D audio can add another level of immersion by adding depth cues between the foreground and background.
Audio instructions
Text instructions don’t perform well in virtual reality for a number of reasons: small text is hard to read, and users are often overloaded with visual information from the virtual environment around them.
76. SURROUND SOUND
• Audio as background
• Using environmental audio to
make the application more
realistic, and to draw the
user’s attention to various
areas of the app.
77. GUIDING WITH HAPTIC FEEDBACK
Haptic feedback conveys information through the sense of touch. Events such as a user touching an object or interacting with controls can benefit from it.
80. CONSIDERATIONS
Know your Audience
• Their hardware - Input, Processing power
• Their play area
• Experience with VR
• Length of play session
Locomotion in VR
Interaction - Input method
Focus on developing for a specific platform
81. THE "GOOD ENOUGH THRESHOLD"
A subjective threshold above which the user’s brain is tricked into believing the virtual world is the “real” world.
Build from the ground up, Test, Iterate
User testing!!!