Eye Tracking

Understanding user attention for better information presentation

The Eye Tracking project seeks to understand where users focus their attention for better presentation of information.

More and more advanced sensors are today placed in our devices and in our work environments. These sensors make it possible to detect and estimate people's attention during problem solving and work. With more detailed knowledge about users' attention, computing systems can provide feedback that is more accurate and better timed to the user's current attentive resources. In our eye tracking project, we analyze user interaction and attention with the aim of developing methods for using gaze data as an additional source of information about the user.

In the Eye Tracking project we have explored feedback methods to support visual search tasks, and studied users' attention and activities while they search for information in multimedia documents or browse photos and videos.

Related Publications

2015

Creating Gaze Annotations in Head Mounted displays

Publication Details
  • International Symposium on Wearable Computers (ISWC)
  • Sep 8, 2015

Abstract

To facilitate distributed communication in mobile settings, we developed a system for creating and sharing gaze annotations using head mounted displays, such as Google Glass. Gaze annotations make it possible to point out objects of interest within an image and add a verbal description to them. To create an annotation, the user simply looks at an object of interest in the image and speaks the information connected to that object. The gaze location is recorded and inserted as a gaze marker, and the voice is transcribed using speech recognition. After an annotation has been created, it can be shared with another person. We performed a user study which showed that users found gaze annotations add precision and expressiveness compared to annotating the whole image.
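The annotation flow the abstract describes (record a gaze fixation, transcribe the spoken note, attach both to the image) can be sketched roughly as follows. All names and the data structure are illustrative assumptions, not from the paper:

```python
# Hypothetical sketch of the gaze-annotation flow: a recorded gaze
# fixation plus a transcribed spoken note form one annotation.
from dataclasses import dataclass

@dataclass
class GazeAnnotation:
    x: float     # gaze x-coordinate in image space
    y: float     # gaze y-coordinate in image space
    note: str    # transcribed speech connected to the gazed-at object

def create_annotation(gaze_point, transcript):
    """Combine a gaze fixation with transcribed speech into a shareable marker."""
    x, y = gaze_point
    return GazeAnnotation(x=x, y=y, note=transcript)

# In the real system the gaze point would come from the head-mounted
# eye tracker and the transcript from a speech recognizer.
ann = create_annotation((412.0, 230.5), "Check the crack near this rivet.")
```

A shared annotation would then be rendered as a marker at `(ann.x, ann.y)` with `ann.note` as its caption.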
2013

Looking Ahead: Query Preview in Exploratory Search

Publication Details
  • SIGIR 2013
  • Jul 28, 2013

Abstract

Exploratory search is a complex, iterative information seeking activity that involves running multiple queries and finding and examining many documents. We introduced a query preview interface that visualizes the distribution of newly-retrieved and re-retrieved documents before showing the detailed query results. Compared with a control condition, the preview affected people's information seeking behavior and improved retrieval performance. With the preview, people spent more time formulating queries, explored search results more deeply, retrieved a more diverse set of documents, and found more distinct relevant documents. The additional time spent on query formulation produced higher quality queries, and as a consequence retrieval improved: both average residual precision and recall were higher with the query preview present.
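The core of the preview, splitting a result list into newly-retrieved versus re-retrieved documents, can be sketched as a simple partition over document identifiers. This is an illustrative assumption about the mechanism, not code from the paper:

```python
# Illustrative sketch: partition a query's results into documents the
# searcher has not yet retrieved ("new") and documents retrieved by an
# earlier query in the same session ("re-retrieved").
def preview_partition(retrieved, previously_seen):
    seen = set(previously_seen)
    new_docs = [d for d in retrieved if d not in seen]
    re_docs = [d for d in retrieved if d in seen]
    return new_docs, re_docs

# Example: a second query returns d1, d2, d3; d2 was already seen.
new, repeated = preview_partition(["d1", "d2", "d3"], {"d2"})
```

A preview widget would then visualize the relative sizes of the two lists before the user commits to examining the results.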
2010
Publication Details
  • Symposium on Eye Tracking Research and Applications 2010
  • Mar 22, 2010

Abstract

In certain applications such as radiology and imagery analysis, it is important to minimize errors. In this paper we evaluate a structured inspection method that uses eye tracking information as a feedback mechanism to the image inspector. Our two-phase method starts with a free viewing phase during which gaze data is collected. During the next phase, we either segment the image, mask previously seen areas of the image, or combine the two techniques, and repeat the search. We compare the different methods proposed for the second search phase by evaluating the inspection method using true positive and false negative rates, and subjective workload. Results show that gaze-blocked configurations reduced the subjective workload, and that gaze-blocking without segmentation showed the largest increase in true positive identifications and the largest decrease in false negative identifications of previously unseen objects.
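The second phase's gaze-blocking step, masking image regions the inspector has already fixated, can be sketched as marking all pixels within some radius of each recorded fixation. The grid representation, function name, and fixed radius are illustrative assumptions, not the paper's implementation:

```python
# Illustrative sketch: build a boolean mask of "already seen" pixels
# from first-phase gaze fixations, so those regions can be blocked
# during the second inspection pass.
def mask_seen_regions(width, height, fixations, radius):
    mask = [[False] * width for _ in range(height)]
    r2 = radius * radius
    for fx, fy in fixations:
        # Mark every pixel within `radius` of this fixation as seen.
        for y in range(max(0, fy - radius), min(height, fy + radius + 1)):
            for x in range(max(0, fx - radius), min(width, fx + radius + 1)):
                if (x - fx) ** 2 + (y - fy) ** 2 <= r2:
                    mask[y][x] = True
    return mask

# One fixation at (5, 5) on a 10x10 image, blocking a radius of 2 pixels.
mask = mask_seen_regions(10, 10, [(5, 5)], 2)
```

In the second phase, pixels where the mask is `True` would be occluded, steering the inspector toward previously unseen areas.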
2009

Gaze-aided human-computer and human-human dialogue

Publication Details
  • Book chapter in Handbook of Research on Socio-Technical Design and Social Networking Systems, eds. Whitworth B., and de Moor, A. Information Science Reference, pp. 529-543.
  • Mar 2, 2009

Abstract

Eye-gaze plays an important role in face-to-face communication. This chapter presents research on exploiting the rich information contained in human eye-gaze for two types of applications. The first is to enhance computer mediated human-human communication by overlaying eye-gaze movement onto shared visual spatial discussion material such as a map. The second is to manage multimodal human-computer dialogue by tracking the user's eye-gaze pattern as an indicator of the user's interest. We briefly review related literature and summarize results from two research projects on human-human and human-computer communication.
2005
Publication Details
  • M.F. Costabile and F. Paternò (Eds.): INTERACT 2005, LNCS 3585
  • Sep 12, 2005

Abstract

We developed and studied an experimental system, RealTourist, which lets a user plan a conference trip with the help of a remote tourist consultant who can view the tourist's eye-gaze superimposed onto a shared map. Data collected from the experiment were analyzed in conjunction with a literature review on speech and eye-gaze patterns. This inspective, exploratory research identified various functions of gaze overlay on shared spatial material, including: accurate and direct display of the partner's eye-gaze, implicit deictic referencing, interest detection, common focus and topic switching, increased redundancy and ambiguity reduction, and an increase of assurance, confidence, and understanding. This study serves two purposes. The first is to identify patterns that can serve as a basis for designing multimodal human-computer dialogue systems with eye-gaze locus as a contributing channel. The second is to investigate how computer-mediated communication can be supported by the display of the partner's eye-gaze.