We have given a number of public demonstrations. Most of them were performed live and not recorded; here we present a collection of those that were captured and uploaded to YouTube or similar sites. We also plan to publish here software that is ready to be used outside the research group.

ETRA 2021 keynote talk by Päivi Majaranta

Eyeglasses once revolutionized human vision by offering an easy, non-invasive way to compensate for deficiencies in vision. Similarly, multimodal wearable technology may act as a facilitator to non-invasively augment human senses, action, and cognition – seamlessly, as if the enhancements were part of our natural abilities. In many scenarios, gaze plays a crucial role due to its unique capability to convey the focus of interest. Gaze has already been used in assistive technologies to compensate for impaired abilities, and there are lessons learned that can be applied in human augmentation. In this talk, the speaker summarizes some key lessons learned from the research conducted over the past decades. She also discusses the role of gaze in multimodal interfaces and examines how different types of eye movements, together with other modalities such as haptics, can support such augmentation. The talk ends with a call for research to realize the vision of an augmented human.

Play video on YouTube (opens in new tab)

This video is from the LEAD ME Summer Training School Warsaw 2021. Slides

Supporting Making Fixations and the Effect on Gaze Gesture Performance

Gaze gestures are deliberate patterns of eye movements that can be used to invoke commands. They are less reliant on accurate measurement and calibration than other gaze-based interaction techniques, and may be used with wearable displays fitted with eye-tracking capability or as part of an assistive technology. The visual stimuli on the display that can act as fixation targets may or may not be sparse, and will vary over time. The paper describes an experiment investigating how the amount of information provided on a display to assist in making fixations affects gaze gesture performance. The impact of providing visualization guides and small fixation targets on gesture completion times and error rates is presented. The number and durations of fixations made during gesture completion are used to explain differences in performance as a result of practice and direction of eye movement.

Play video on YouTube (opens in new tab)

EyeSketch: A drawing application for gaze control

Henna Heikkilä demonstrates EyeSketch, an application she developed for drawing with the eyes only. The application uses drawing objects that can be moved and resized, and whose color attributes can be changed after drawing. Tool and object selections are implemented with dwell buttons. Tools for moving and resizing are controlled with gaze gestures and by closing the eyes. The gaze gestures are simple, one-segment gestures that end outside the screen area; they give the direction for moving an object, and also the command to make it smaller or larger. Closing the eyes signals the application to stop a moving object. In an evaluation study, these gaze gestures were judged a usable interaction style for moving and resizing.
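To illustrate the idea of one-segment gestures that end outside the screen area, here is a minimal sketch (not the EyeSketch source; the screen size, sample format, and function name are assumptions) of classifying such a gesture by the edge through which the gaze leaves the screen:

```python
# Illustrative sketch: detect a one-segment gaze gesture that ends outside
# the screen, and report its direction. Assumed screen size and (x, y)
# pixel-coordinate gaze samples; not the actual EyeSketch implementation.

SCREEN_W, SCREEN_H = 1920, 1080

def classify_exit_gesture(gaze_points):
    """Return the direction ("left"/"right"/"up"/"down") of the first
    gaze sample that leaves the screen, or None if the path stays on screen."""
    for x, y in gaze_points:
        if x < 0:
            return "left"
        if x >= SCREEN_W:
            return "right"
        if y < 0:
            return "up"
        if y >= SCREEN_H:
            return "down"
    return None

# Example: the gaze sweeps rightwards and crosses the right screen edge.
path = [(900, 500), (1400, 510), (1950, 515)]
print(classify_exit_gesture(path))  # -> right
```

In a real application the direction would then be combined with the currently selected tool, e.g. "move the object right" or "grow the object".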

Play video on YouTube (opens in new tab)

Tic-tac-toe game played using eye-gaze only

Turn-based games are the most trivial to play using gaze only. We have developed a few board games that can be played with any of the eye-tracking devices available in our lab. Tic-tac-toe is the easiest to start with for novices who have never controlled anything with their eye movements. The game has only a 3×3 grid of large buttons (gaze-active areas) and is therefore very resistant to calibration inaccuracies and gaze offsets.
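A small sketch shows why large gaze-active areas tolerate calibration error (the screen size and grid mapping are assumptions for illustration, not the game's actual code): a gaze estimate can be off by tens of pixels and still land in the intended cell.

```python
# Illustrative sketch: map a gaze point to a cell of a full-screen 3x3 board.
# Assumed 1920x1080 screen; not the actual tic-tac-toe implementation.

SCREEN_W, SCREEN_H = 1920, 1080

def cell_at(gx, gy):
    """Map a gaze point in screen pixels to a (row, col) cell of the board."""
    col = min(int(gx / (SCREEN_W / 3)), 2)  # clamp to the grid
    row = min(int(gy / (SCREEN_H / 3)), 2)
    return row, col

# True gaze at the centre cell; a measurement offset by ~80 px still hits it.
true_point = (960, 540)
measured = (960 + 80, 540 - 60)
print(cell_at(*true_point), cell_at(*measured))  # both map to (1, 1)
```

With cells roughly 640×360 px, even an offset of well over 100 px usually selects the intended square, which is why the game works for uncalibrated novices.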

Play video on YouTube (opens in new tab)

Old Skype version controlled using gaze

Old versions of Skype had a public API that could be used to build custom client applications. We developed such a client, with contact-list entries large enough to hit with the gaze pointer; the list could also be scrolled if it did not fit on the screen. Users could start and stop calls, answer incoming calls, and write text messages with an on-screen keyboard that was also developed as part of the application. Besides letters, numbers, and other traditional keyboard characters, the keyboard contained a set of emoji icons and tools to edit text.

Play video on YouTube (opens in new tab)

Silko, a gaze-enabling tool for teachers and pupils

Silko is a web application developed by researchers at the University of Tampere. It allows schoolteachers to prepare online reading tasks for their students and then inspect their reading process. Students complete the tasks with an eye tracker connected to their PC. They may get reading support, such as hyphenation of long-gazed words as they read, if the teacher has enabled this feature for the task. Teachers may also ask students to complete a questionnaire after the reading task. A questionnaire may contain general questions about the task text to check students’ comprehension, or questions related to specific words if those words were gazed at long enough.
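The "long-gazed word" support can be sketched as a dwell-time accumulator: gaze time is summed per word, and once a word crosses a threshold it is shown hyphenated. The threshold value, data format, and toy hyphenation below are assumptions for illustration, not Silko's actual logic.

```python
# Illustrative sketch of dwell-triggered reading support (not Silko's code).

DWELL_THRESHOLD_MS = 1500  # assumed threshold for a "long-gazed" word

def update_gaze(gaze_time_ms, word, fixation_ms):
    """Accumulate fixation time on a word; return True once the word
    has been gazed at long enough to trigger support."""
    gaze_time_ms[word] = gaze_time_ms.get(word, 0) + fixation_ms
    return gaze_time_ms[word] >= DWELL_THRESHOLD_MS

def hyphenate(word):
    # Toy hyphenation: split in the middle. Real hyphenation is
    # language-aware (e.g. follows Finnish syllable rules).
    mid = len(word) // 2
    return word[:mid] + "-" + word[mid:]

times = {}
for fixation in (600, 500, 450):          # three fixations on the same word
    if update_gaze(times, "comprehension", fixation):
        print(hyphenate("comprehension"))  # shown once 1500 ms is exceeded
```

The same accumulated gaze times could later drive the word-specific questionnaire items mentioned above.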

Several visualizations of students’ gaze paths are available to teachers. They may inspect an individual gaze path in a traditional gaze-plot visualization, or compare the reading performance of several students by replaying their gaze paths or word-focusing events. Teachers may also analyze numerical indicators such as reading speed, number of reading regressions, and average fixation duration. In addition, a summary statistic shows the teacher which words caused reading difficulties for most students.
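Indicators of this kind can be derived directly from the fixation sequence. The sketch below (an assumption-laden illustration, not Silko's implementation) treats a fixation as a `(word_index, duration_ms)` pair and counts a regression whenever a fixation lands on an earlier word than the previous one:

```python
# Illustrative sketch: compute simple reading indicators from fixation data.
# Data format and definitions are assumptions, not Silko's actual metrics.

def reading_indicators(fixations, n_words, total_time_s):
    """fixations: list of (word_index, duration_ms) in reading order."""
    durations = [d for _, d in fixations]
    avg_fixation_ms = sum(durations) / len(durations)
    # A regression: the gaze jumps back to an earlier word.
    regressions = sum(
        1 for (prev, _), (cur, _) in zip(fixations, fixations[1:])
        if cur < prev
    )
    words_per_minute = n_words / (total_time_s / 60)
    return {
        "avg_fixation_ms": avg_fixation_ms,
        "regressions": regressions,
        "words_per_minute": words_per_minute,
    }

# Five fixations over a five-word text read in 3 seconds;
# the jump from word 3 back to word 2 counts as one regression.
fixes = [(0, 220), (1, 180), (3, 250), (2, 300), (4, 200)]
print(reading_indicators(fixes, n_words=5, total_time_s=3.0))
```

Aggregating such per-student numbers across a class is what lets the teacher spot the words that were difficult for most students.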