The Ubiquitous Eye Tracker: Eye tracking has become a key method for testing the usability of websites and software. It provides researchers and practitioners with objective, convincing data describing user behavior and usability problems. Eye tracking is also used to study user interaction with mobile devices and physical products. It augments traditional usability methods, providing additional information that the test participant cannot report and the researcher cannot observe. Unique insights about first glance, search patterns, failed search, and much more offer guidance on how to solve different usability problems. Eye tracking can be used together with a variety of research methods, including observations, interviews and the retrospective think aloud (RTA) method. Over the past ten years, eye trackers have evolved from bulky, expensive and difficult-to-use devices to a level of maturity that will soon be good enough for widespread consumer use. Today's solutions enable more dynamic research setups and a wider range of uses, but what about tomorrow? What does it mean for us as UX professionals when gaze information from our users will be available all the time, everywhere?
26. AuraLamp. Jeffrey S. Shell, Roel Vertegaal, and Alexander W. Skaburskis. Eyepliances: attention-seeking devices that respond to visual attention. In CHI '03: CHI '03 Extended Abstracts on Human Factors in Computing Systems, pages 770–771, New York, NY, USA, 2003. ACM Press.
32. Eye Tracking Social, tomorrow at 19:15–21:15
Tobii Glasses 2 – how will they affect the UX live viewing room?
Rob Smith, Tobii
Eye Tracking the mobile user experience
Andrew Schall, SPARK Experience
Aggregate eye tracking data is of little use in UX research
Peter Collins, Web Usability Partnership
Observing natural behavior in UX research
Guy Redwood, SimpleUsability
(Plus: free food and drinks!)
Editor's Notes
I have named this talk "The Ubiquitous Eye Tracker", so I thought I'd start by explaining what I really mean by that.
To explain that, I want to talk about one thing that I love and one thing that I hate. And in a way, they both have to do with interaction design.
Her, I love. Not just because of Scarlett Johansson's lovely voice acting, but because I love the unobtrusiveness of the interaction with technology envisioned in this movie.
Sunglasses, I hate. Don't get me wrong, they are very practical, and sometimes even necessary, but not being able to look someone in the eye, or simply see what they are looking at, is really disconcerting. The reason I feel this way is, of course, that so much of the information we use to understand other people's behavior comes from interpreting their gaze. How, then, can we expect to understand our users without this information?
In my opinion, the type of interaction envisioned in Her will never be possible to achieve without technology knowing what you are looking at and understanding why you are looking at it.
Today, our gadgets and our surroundings keep track of what we do all the time in order to adapt better to what we want and need. Online, we are tracked and subjected to A/B testing, and websites collect information about us so that they can provide better services. Our real-world behavior is logged using GPS, check-ins, likes and whatnot. All of this so that the companies whose services we use can understand us better and adjust and improve their offerings to us. What makes this possible is the fact that the technology used to log this behavior is ubiquitous, or close to it. Carrying our phones with us everywhere and performing more and more activities online allows this kind of collection of user behavior data.
Now, so far eye tracking has been very much an ad hoc endeavor. You make an investment in the necessary hardware and software. Tests are conducted in a lab, or in some cases by moderators visiting test participants in a more natural environment. It requires a lot of manual setup, and analyzing the data is very hands-on and time-consuming. This is of course relative; if you compare how it is today to how it was ten years ago, efficiency has improved a lot. But if you want to run a test on 2,000 people, it is going to take quite a lot of work. And if you want to do it every week or every day, it becomes impossible.
Over the past few years, web and software development have moved to continuous delivery, releasing new updates every week, day or even hour. That, combined with a truly global online market, means there is a very real need for continuous testing with user groups that are separated both geographically and demographically. A lot of companies do this today by testing changes on small groups of users before fully rolling them out, but including eye tracking in this type of testing is simply not feasible today. So what I mean by "the Ubiquitous Eye Tracker", at least in the context of UX, is essentially eye tracking data at an ultra-quantitative scale.
So, how can we get to a point where something like this would be possible? I believe there are four areas where improvement is needed.
The technology. Today’s eye trackers are too bulky, too difficult to use and don’t track well enough to work in all the necessary conditions.
The price. This is an obvious factor.
The usefulness. For people to buy devices with eye tracking built in, the eye tracking needs to provide tangible value.
And fourth - Knowledge. This is the most important one and I will get to that later.
If we start by looking at the technology: this beauty is the first eye tracker Tobii ever sold. That was in 2001, and it kind of marked the start of modern eye tracking. It was built by hand, and the eye tracking camera itself cost €2,000 from the manufacturer.
If we fast-forward six years, we get to the T120. It looks a lot sleeker, of course, but technically it is actually very similar to the older one: it has two of those bulky, expensive cameras inside.
If we skip to 2014, we have this little thing, the SteelSeries Sentry, the world's first consumer-level eye tracker.
And of course this handsome beast, the second generation Tobii Glasses, which we are very proud to show off here at UXPA.
And that actually brings me to my cover slide. This is what it looked like. Does anyone have a guess as to what this flyspeck might be? I actually have one right here, but it might be a bit tricky for you to see. This is one of the four eye tracking cameras that we use in the new Tobii Glasses. It is about one square millimeter. If you compare this little thing to the cameras used in older eye trackers, it becomes quite obvious that technology will not hinder the development of the ubiquitous eye tracker.
As for the price, I am just going to show this simple chart. This is the approximate price history of research-grade eye trackers since the year 2000. Tobii has the explicit goal of equipping every device with a built-in eye tracker. For that to happen, the price point needs to be very close to zero. We are already seeing $100 eye trackers trying to enter the market, and Samsung and Amazon are dabbling in rudimentary eye tracking in their phones. I think it is safe to say that the free eye tracker is coming.
The third point on the list was usefulness. For there to be eye trackers everywhere, there needs to be an incentive from a user perspective to buy devices with eye tracking sensors built in. So, why would anyone want to have an eye tracker in their laptop or tablet? This is an area that is being heavily explored on many fronts.
One example is of course the SteelSeries Sentry that I mentioned earlier, which will be used by gamers to analyze their own gaze patterns to improve their gameplay. Later on, it will also allow users to control computer games using their eyes.
I have already mentioned Amazon and Samsung, who use eye tracking in both explicit and implicit interaction in their phones.
We have Umoove that retrofits iOS devices with some rudimentary eye tracking.
Another pretty cool example is this: the Auralamp. You can turn this lamp on and off using a combination of gaze and voice control. Simply look at it and tell it to turn on or off. It’s not a finished product by any means, but it’s a nice example of possible future uses.
My personal favorites are less explicit interactions, similar to the 3D effect Amazon uses eye tracking for. To my point earlier: if your computer knows where you are looking, it can use this information to understand you better and anticipate your actions. For example, it could start loading an application in the background. In addition to that, there are of course loads of other use cases. While non-professional use of eye tracking is a less mature field, I think it is safe to say that there will be good reasons for people to want eye trackers in their devices.
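To make the "anticipate your actions" idea concrete, here is a minimal sketch of dwell-based triggering: if gaze stays inside a screen region for long enough, fire a preload action. The function names, the 500 ms threshold and the sample format are all illustrative assumptions, not any real Tobii API.

```python
# Illustrative sketch (not a real API): trigger a preload action once
# the user's gaze has dwelt on a screen region for long enough.

DWELL_THRESHOLD_MS = 500  # hypothetical dwell time before we act

def inside(region, x, y):
    """Check whether a gaze point falls inside a rectangular region."""
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom

def preload_on_dwell(samples, region, on_dwell):
    """samples: iterable of (timestamp_ms, x, y) gaze samples.

    Calls on_dwell() once when gaze has stayed inside `region` for at
    least DWELL_THRESHOLD_MS; returns True if it fired, else False."""
    dwell_start = None
    for t, x, y in samples:
        if inside(region, x, y):
            if dwell_start is None:
                dwell_start = t  # gaze just entered the region
            elif t - dwell_start >= DWELL_THRESHOLD_MS:
                on_dwell()
                return True
        else:
            dwell_start = None  # gaze left the region, reset the timer
    return False
```

In a real system the "samples" would come from a gaze data stream and on_dwell() would kick off the background load; the logic stays the same.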
Now to the fourth, last and most important point. Knowledge.
It was not long ago that the main eye tracking deliverable for a majority of usability practitioners was a simple heatmap.
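For context, a heatmap is at its core just an aggregation: gaze or fixation points binned into screen cells and counted. A minimal sketch, assuming fixations arrive as (x, y) pixel coordinates; real tools also smooth the counts (e.g. with a Gaussian kernel) and weight them by fixation duration.

```python
# Minimal sketch of heatmap aggregation: divide the screen into a grid
# of cells and count how many fixations land in each cell.

def heatmap(fixations, width, height, cell=100):
    """fixations: iterable of (x, y) points in pixels.

    Returns a dict mapping (col, row) grid cells to fixation counts;
    points outside the width x height screen are ignored."""
    counts = {}
    for x, y in fixations:
        if 0 <= x < width and 0 <= y < height:
            key = (int(x // cell), int(y // cell))
            counts[key] = counts.get(key, 0) + 1
    return counts
```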
As people's understanding of eye tracking data grows, we are moving away from that toward more interesting deliverables, like actual tangible metrics, or toward using live viewing only.
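One such tangible metric is time to first fixation on an area of interest (AOI). A sketch under assumptions: fixations have already been extracted from the raw gaze stream as (timestamp_ms, x, y) tuples, and the AOI is a simple rectangle.

```python
# Time to first fixation on an AOI, a standard eye tracking metric:
# how long after the recording starts does the user first look at
# a given screen region?

def time_to_first_fixation(fixations, aoi):
    """fixations: list of (timestamp_ms, x, y), in chronological order.
    aoi: (left, top, right, bottom) rectangle in pixels.

    Returns the time in ms from the first fixation to the first
    fixation inside the AOI, or None if the AOI is never fixated."""
    if not fixations:
        return None
    start = fixations[0][0]
    left, top, right, bottom = aoi
    for t, x, y in fixations:
        if left <= x <= right and top <= y <= bottom:
            return t - start
    return None
```

Averaging this value over many participants is exactly the kind of tangible, comparable number that a heatmap alone cannot give you.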
But how does one analyze eye tracking data from hundreds of thousands of people? They are not performing a task that you have given them; they are just behaving in their natural, erratic way. The field of big data and test-driven design is, if not adult, then at least in its teens in terms of maturity. The challenge that we will face, and by "we" I mean us as UX professionals, is building the frameworks, the software and the understanding of this data that allow us to tap into its full potential. How to do this is a question I am going to leave you with, because I do not have the answer myself.