Validating a smartphone-based
pedestrian navigation system prototype
An informal eye-tracking pilot test
M. Kluge & H. Asche
University of Potsdam
Germany
ICCSA 2012, Salvador de Bahia, 19.06.2012
© kluge&asche·ifg·uni·potsdam·2012 1 | 20
2. Structure
1. Introduction
2. Concept
3. Eye-Tracking
4. Results
5. Summary
3. Chapter 1 | Introduction
4. Chapter 1 | Introduction
Pedestrian Navigation
pedestrian traffic is an integral part of traffic; pedestrians are road users and passengers in public transport
existing navigation systems for pedestrians are based on 2D map representations or depict reality as a 3D model
the majority of pedestrian navigation systems (PNS) on the market are scaled-down versions of vehicle navigation systems (VNS)
navigation problems occur when the user is unable to relate the information of an instruction to the real environment; one reason is the visualization of the navigation instruction
Hypothesis: the use of Augmented Reality matches human visual perception, relates the AR instructions to prominent objects, and allows the instructions to be embedded in the real environment
5. Chapter 1 | Introduction
Augmented Reality
AR allows the user to see the image of the real world superimposed with virtual objects
AR is part of the virtuality continuum of mixed reality; mixed reality combines different forms of representation
AR supplements the environment with additional information instead of recreating or replacing it
Mixed Reality continuum: Real Environment → Augmented Reality → Augmented Virtuality → Virtual Environment (Source: Milgram & Kishino, 1994)
6. Chapter 2 | Concept
7. Chapter 2 | Concept
Conception
the central focus is the combination of reality and virtual reality
this common view is extended by a virtual route representation, which follows the route course in reality
the structure is as follows:
the perspective adjustment and the calculation of the virtual image scene require a data model, which remains hidden from the viewer
the prototype superimposes a virtual route on the live camera image
the instruction is displayed on the screen and perceived by the user
Pipeline: Map Data → Virtual Route → Camera Image → RealityView → Screen → User
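The "perspective adjustment" step can be illustrated with a basic pinhole-camera projection that maps a 3D route point into 2D image coordinates; the function name, focal length, and coordinates below are illustrative assumptions, not the prototype's implementation.

```python
def project_route_point(point_cam, focal_px, cx, cy):
    """Project a 3D route point (camera coordinates, metres) onto
    the 2D image plane of a pinhole camera (pixel coordinates)."""
    x, y, z = point_cam
    if z <= 0:                      # point behind the camera: not visible
        return None
    u = cx + focal_px * x / z       # horizontal pixel position
    v = cy + focal_px * y / z       # vertical pixel position
    return (u, v)

# A route point 2 m to the right and 10 m ahead, on a 640x480 image
# with an assumed 800 px focal length:
print(project_route_point((2.0, 0.0, 10.0), 800.0, 320.0, 240.0))  # → (480.0, 240.0)
```

Points farther along the route converge toward the image centre, which is what makes the overlaid route appear to recede into the scene.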
8. Chapter 2 | Concept
Construction
the prototype has a modular construction and is based on a free and open-source navigation platform (http://www.navit-project.org/)
the structure of the concept consists of three processes: registration, tracking and presentation
registration process: captures position and alignment from the sensors
tracking process: specifies the trace of the virtual objects
presentation process: describes the output on the screen
[Architecture diagram: hardware sensors (GPS, acceleration/orientation, compass, camera) and environmental conditions feed the registration process; tracking drives the graphical video system, which overlays the virtual route on the video stream; the presentation process renders the Augmented Reality display to the user.]
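The three-process structure (registration, tracking, presentation) can be sketched as a minimal pipeline; all names and data shapes below are illustrative assumptions, not code from the prototype.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    lat: float      # GPS latitude
    lon: float      # GPS longitude
    heading: float  # compass heading in degrees

def registration(gps, compass):
    """Registration: capture position and alignment from the sensors."""
    return Pose(lat=gps[0], lon=gps[1], heading=compass)

def tracking(pose, route):
    """Tracking: keep only route points still ahead of the current
    position (a crude stand-in for tracing the virtual objects)."""
    return [p for p in route if p[0] >= pose.lat]

def presentation(visible_route):
    """Presentation: describe the output drawn over the video stream."""
    return f"overlaying {len(visible_route)} route points on the camera image"

pose = registration((52.40, 13.01), compass=90.0)
route = [(52.39, 13.01), (52.41, 13.02), (52.42, 13.02)]
print(presentation(tracking(pose, route)))
```

The separation mirrors the slide's point: each process has one responsibility, so the sensor layer can change without touching the display layer.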
9. Chapter 2 | Concept
RealityView
Presentation
the screen can be operated in two display modes (AR and map)
the mode is activated or changed by changing the alignment of the device
Virtual Cable
a virtual route is positioned with respect to the real location
the concept is taken from vehicle navigation and is called Virtual Cable
its features are described as "safe, simple and intuitive"
it navigates a user along a route without a detailed description
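The alignment-based mode switch can be sketched as a simple threshold on the device's pitch angle; the 45-degree threshold and the function name are illustrative assumptions, not values from the paper.

```python
def display_mode(pitch_deg):
    """Choose the display mode from the device's pitch angle:
    held upright (camera facing forward) -> AR mode,
    held flat (screen facing up)        -> map mode.
    The 45-degree threshold is an assumed value."""
    return "AR" if pitch_deg > 45.0 else "map"

print(display_mode(80.0))  # device upright → "AR"
print(display_mode(10.0))  # device flat    → "map"
```

In practice the pitch would come from the accelerometer listed on the previous slide, smoothed to avoid mode flicker near the threshold.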
10. Chapter 3 | Eye-Tracking
11. Chapter 3 | Eye-Tracking
Eye-Tracking Technology
eye-tracking records eye movements and the point of gaze (image source: http://www.smivision.com/)
it is described as a non-invasive, video-based technology
the use of a mobile eye-tracker allows recording in outdoor areas
the tracking process consists of the following steps:
calibration procedure to measure the properties of the eye
measurement of the pupil center
locating the relative position of the corneal reflection
calculating the direction of gaze
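The steps above can be sketched with the classic pupil-to-corneal-reflection vector: the gaze point is estimated from the offset between pupil centre and corneal reflection, mapped through calibration coefficients. The linear mapping and all numbers are simplifying assumptions; real trackers fit a richer calibration model.

```python
def gaze_point(pupil, reflection, gain, offset):
    """Estimate the point of gaze in scene-camera pixels from the
    vector between pupil centre and corneal reflection.
    gain/offset are assumed to come from the calibration procedure;
    a linear mapping is a deliberate simplification."""
    dx = pupil[0] - reflection[0]
    dy = pupil[1] - reflection[1]
    return (offset[0] + gain[0] * dx, offset[1] + gain[1] * dy)

# Illustrative calibration values, not measured ones:
print(gaze_point(pupil=(310, 242), reflection=(300, 240),
                 gain=(25.0, 25.0), offset=(320.0, 240.0)))  # → (570.0, 290.0)
```

The corneal reflection anchors the measurement: because it stays nearly fixed under small head movements, the pupil-reflection vector isolates eye rotation from headgear slippage.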
12. Chapter 3 | Eye-Tracking
Eye-Tracking Setup
Experimental Procedure
only a pilot test was performed
the test sessions were divided into three parts: introduction, test run, questionnaire
Experimental Setup
a user-centered eye camera captures the point of gaze
a scene camera records the real environment
both cameras are mounted on a headgear (helmet), together with the smartphone camera
the recorded information is stored by a mobile computer
13. Chapter 3 | Eye-Tracking
Eye-Tracking Evaluation
the evaluation allows a scientific analysis of the eye movements and displays the duration of fixation on top of the video sequence
the following properties are measured:
the pattern of the various fixations (scan paths)
the time spent looking at display elements (growing circle)
the use of prominent objects in the real environment
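Fixation durations such as those evaluated here are commonly derived with a dispersion-threshold idea: consecutive gaze samples that stay within a small spatial spread form one fixation. The sketch below is a simplified version of that idea, not the tool the authors used; thresholds and names are assumptions.

```python
def fixation_durations(samples, max_dispersion, min_samples):
    """Group consecutive gaze samples (x, y) into fixations while their
    spread stays under max_dispersion pixels; return the length (in
    samples) of each fixation found. Simplified dispersion-threshold."""
    fixations, window = [], []
    for s in samples:
        window.append(s)
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # spread exceeded: close the previous fixation (if long enough)
            if len(window) - 1 >= min_samples:
                fixations.append(len(window) - 1)
            window = [s]
    if len(window) >= min_samples:
        fixations.append(len(window))
    return fixations

# Two clusters of samples → two fixations of three samples each:
gaze = [(100, 100), (101, 99), (100, 101), (200, 200), (201, 199), (200, 201)]
print(fixation_durations(gaze, max_dispersion=10, min_samples=3))  # → [3, 3]
```

Multiplying each count by the tracker's sampling interval converts these sample counts into fixation durations in milliseconds.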
14. Chapter 4 | Results
15. Chapter 4 | Results
Test Run
16. Chapter 4 | Results
Test Run Results
with one exception, all test subjects had an identical favorite at the individual stations
most of the users tried to use the AR mode at decision points with more than one option
the map display was used more often at two decision points
all subjects used prominent objects to recognize the instruction in the real environment
the most common navigation strategy was to follow the virtual cable in the AR mode and to display the target and the current location in the 2D map
conclusion:
the AR display provides detailed navigation at decision points
the map display allows a better overview
17. Chapter 4 | Results
Questionnaire
Structure
the questionnaire is based on the System Usability Scale (SUS) and comprises ten statements
the rating is based on a Likert scale
[Chart: Likert responses (0-6) of Persons 1-3 for questions 1-10]
the validation provided a mean score of 75 out of 100 points
a SUS score of 75 points can be interpreted as a grade of B
all test subjects rated the third statement, "I thought the system was easy to use", with the value 5
in conclusion, the requirement of intuitive use has been fulfilled
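The standard SUS scoring rule behind a score like 75 can be shown in a few lines; the responses below are illustrative values, not the study's data.

```python
def sus_score(responses):
    """System Usability Scale: ten Likert responses (1-5).
    Odd-numbered statements contribute (response - 1), even-numbered
    statements (5 - response); the sum is scaled by 2.5 to 0-100."""
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Illustrative responses (alternating 4 and 2) happen to yield 75:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

The alternating positive/negative wording of the ten SUS statements is why odd and even items are scored in opposite directions.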
18. Chapter 5 | Summary
19. Chapter 5 | Summary
Summary
the paper described the conceptual structure of an AR-based PNS and its validation based on eye-tracking technology
the validation of the prototype supports the hypothesis that the use of Augmented Reality for pedestrian navigation is possible and accepted by the user group
the evaluation of the eye-tracking pilot study showed that the use of AR favors the selection of prominent objects in the environment
Outlook
a key factor in the success of future developments lies in the precision and quality of the hardware devices, especially the accuracy of the built-in sensor components
a meaningful evaluation of the prototype requires a quantitative evaluation over a longer period as well as a repetition and expansion of the presented eye-tracking pilot test
20. Thank You for your attention!
Mario Kluge & Hartmut Asche
University of Potsdam
Department of Geography
mario.kluge@uni-potsdam.de
hartmut.asche@uni-potsdam.de