Gaze-Controlled Games
 

Online information resources on how to use gaze for the control of games and other leisure applications.


Papers I: Evaluation of using gaze control in games and other leisure applications

Gaze Controlled Games

Abstract:
The quality and availability of eye tracking equipment has been increasing while costs have been decreasing. These trends increase the possibility of using eye trackers for entertainment purposes. Games that can be controlled solely through movement of the eyes would be accessible to persons with decreased limb mobility or control. On the other hand, use of eye tracking can change the gaming experience for all players, by offering richer input and enabling attention-aware games. Eye tracking is not currently widely supported in gaming, and games specifically developed for use with an eye tracker are rare. This paper reviews past work on eye tracker gaming and charts future development possibilities in its different sub-domains. It argues that based on the user input requirements and gaming contexts, conventional computer games can be classified into groups that offer fundamentally different opportunities for eye tracker input. In addition to the inherent design issues, there are challenges and varying levels of support for eye tracker use in the technical implementations of the games.

Reference:
Isokoski, P., Joos, M., Spakov, O., & Martin, B. (2009). Gaze Controlled Games. Universal Access in the Information Society, 8(4). Springer.
Link: [1]

Eye Tracker Input in First Person Shooter Games

Abstract:
We report ongoing work on using an eye tracker as an input device in first person shooter (FPS) games. In these games the player moves in a three-dimensional virtual world that is rendered from the player's point of view. The player interacts with the objects he or she encounters mainly by shooting at them. Typical game storylines reward killing and punish other forms of interaction. The reported work is part of an effort to evaluate a range of input devices in this context. Our results on the other devices in the same game allow us to compare the efficiency of eye trackers as game controllers against more conventional devices. Our goal regarding eye trackers is to see whether they can help players perform better. Some FPS games are played competitively over the Internet. If using an eye tracker gives an edge in competitive play, players may want to acquire eye tracking equipment. Eye trackers as input devices in FPS games have been investigated before (Jönsson, 2005), but that investigation focused on user impressions rather than on the efficiency and effectiveness of eye trackers in this domain. However, Jönsson's results on eye tracker efficiency in a non-FPS game were encouraging.

Reference:
Isokoski, P., & Martin, B. (2006). Eye Tracker Input in First Person Shooter Games. In Proceedings of COGAIN 2006: Gazing into the Future, 78-81.
Link: http://www.cs.uta.fi/~poika/cogain2006/cogain2006.pdf

Use of Eye Movements for Video Game Control

Abstract:
We present a study that explores the use of a commercially available eye tracker as a control device for video games. We examine its use across multiple gaming genres and present games that utilize the eye tracker in a variety of ways. First, we describe a first person shooter that uses the eyes to control orientation. Second, we study the use of eye movements for more natural interaction with characters in a role playing game. And lastly, we examine the use of eye tracking as a means to control a modified version of the classic action/arcade game Missile Command. Our results indicate that the use of an eye tracker can increase the immersion of a video game and can significantly alter the gameplay experience.

Reference:
Smith, J. D., & Graham, T. C. N. (2006). Use of Eye Movements for Video Game Control. In ACM Advancements in Computer Entertainment Technology (Hollywood, CA, USA, June 14 - 16, 2006). ACE 2006. ACM Press, New York, NY.
Link: http://www.cs.queensu.ca/~smith/papers/ace2006.pdf

If Looks Could Kill - An Evaluation of Eye Tracking in Computer Games

Abstract:
The possibility to track human eye gaze is not new. Different eye tracking devices have been available for several years. The technology has for instance been used in psychological research, usability evaluation and in equipment for disabled people. The devices have often required the user to utilize a chinrest, a bite board or other cumbersome equipment. Hence, the use of eye tracking has been limited to restricted environments.

In recent years, new non-intrusive eye tracking technology has become available. This has made it possible to use eye tracking in new, natural environments. The aim of this study was to evaluate the use of eye tracking in computer games. A literature study was made to gather information about eye tracker systems, existing eye gaze interfaces and computer games. The analysis phase included interviews with people working with human-computer interaction and game development, a focus group session and an evaluation of computer games. The result of the analysis was a summary of interaction sequences presumably suitable for control with the eyes. Three different prototypes of eye controlled computer games were developed. The first was a shoot'em up game where the player aimed with his eyes to shoot monsters that appeared in random places. The two other prototypes were developed with the Half Life Software Development Kit. In the first Half Life prototype, the player aimed a weapon with his eyes. In the second, the view of sight was controlled with the eyes. The different eye controlled game prototypes were evaluated in a usability study. The subjects played the different prototypes with mouse and eyes respectively. Their experience was evaluated with the thinking aloud method, questionnaires and an interview. The result showed that interaction with the eyes is very fast, easy to learn and perceived to be natural and relaxed. According to the usability study, eye control can provide a more fun and engaging gaming experience than ordinary mouse control. Eye-controlled computer games are a very new area that needs to be further developed and evaluated. The result of this study suggests that eye based interaction may be very successful in computer games.

Reference:
Jönsson, E. (2005). If Looks Could Kill - An Evaluation of Eye Tracking in Computer Games. Master's Thesis, Department of Numerical Analysis and Computer Science, Royal Institute of Technology, Stockholm, Sweden.
Link:
http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf

EyeChess: the tutoring game with visual attentive interface

Abstract:
Advances in eye tracking have enabled physically challenged people to type, draw, and control the environment with their eyes. However, entertainment applications for this user group are still few. The EyeChess project described in this paper is a PC based tutorial to assist novices in playing chess endgames. The player always starts first and has to checkmate the Black King in three moves. To make a move, the player first selects a piece and then its destination square. To indicate that some squares could be activated while others were forbidden for selection, color highlighting was applied: a square with a green highlight indicated a valid action, and a red highlight denoted an invalid action. There were three options to make a selection: blinking, eye gestures (i.e., gazing at off-screen targets), and dwell time. If the player does not know how to solve the task, or makes mistakes, the tutorial provides a hint: a blinking green highlight appears when the gaze points at the right square. Preliminary evaluation of the system revealed that dwell time was the preferred selection technique. The participants reported that the game was fun and easy to play using this method. Meanwhile, both the blinking and eye gesture methods were characterized as quite fatiguing. The tutorial was rated helpful in guiding the decision making process and training novice users in gaze interaction.
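
As an illustration of the dwell-time selection that the evaluation found preferable, here is a minimal sketch in Python; the class, the 0.8 s dwell time and the 40 px jitter radius are assumptions for the example, not values from the paper.

 import time

 DWELL_TIME = 0.8   # seconds of steady gaze needed to select (assumed value)
 DWELL_RADIUS = 40  # pixels of tolerated gaze jitter (assumed value)

 class DwellSelector:
     """Fires a selection once gaze has rested on one spot long enough."""

     def __init__(self, dwell_time=DWELL_TIME, radius=DWELL_RADIUS):
         self.dwell_time = dwell_time
         self.radius = radius
         self.anchor = None   # where the current dwell started
         self.start = None    # when it started

     def update(self, x, y, now=None):
         """Feed one gaze sample; returns the dwell point when a selection fires."""
         now = time.monotonic() if now is None else now
         if self.anchor is None:
             self.anchor, self.start = (x, y), now
             return None
         ax, ay = self.anchor
         if (x - ax) ** 2 + (y - ay) ** 2 > self.radius ** 2:
             # Gaze moved away: restart the dwell at the new position.
             self.anchor, self.start = (x, y), now
             return None
         if now - self.start >= self.dwell_time:
             selected, self.anchor, self.start = self.anchor, None, None
             return selected  # e.g. map this point to a chess square
         return None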

Reference:
Spakov, O. (2005). EyeChess: the tutoring game with visual attentive interface. Alternative Access: Feelings and Games 2005, Department of Computer Sciences, University of Tampere, Finland.
Link: http://www.cs.uta.fi/~oleg/

EyeDraw: A System for Drawing Pictures with Eye Movements

Abstract:
This paper describes the design and development of EyeDraw, a software program that will enable children with severe mobility impairments to use an eye tracker to draw pictures with their eyes so that they can have the same creative developmental experiences as nondisabled children. EyeDraw incorporates computer-control and software application advances that address the special needs of people with motor impairments, with emphasis on the needs of children. The contributions of the project include (a) a new technique for using the eyes to control the computer when accomplishing a spatial task, (b) the crafting of task-relevant functionality to support this new technique in its application to drawing pictures, and (c) a user-tested implementation of the idea within a working computer program. User testing with nondisabled users suggests that we have designed and built an eye-cursor and eye-drawing control system that can be used by almost anyone with normal control of their eyes. The core technique will be generally useful for a range of computer control tasks such as selecting a group of icons on the desktop by drawing a box around them.

Reference:
Hornof, A., Cavender, A., & Hoselton, R. (2004). EyeDraw: A System for Drawing Pictures with Eye Movements. Proceedings of the ACM SIGACCESS Conference on Computers and Accessibility.
Link: http://www.cs.uoregon.edu/~hornof/downloads/ASSETS04.pdf

Design of a computer game using an eye-tracking device for eye's activity rehabilitation

Abstract:
An eye mouse interface that can be used to operate a computer using the movement of the eyes is described. We developed this eye-tracking system for eye motion disability rehabilitation. When the user watches the screen of a computer, a charge-coupled device catches images of the user's eye and transmits them to the computer. A program, based on a new cross-line tracking and stabilizing algorithm, locates the center point of the pupil in the images. Calibration factors and energy factors are designed for coordinate mapping and blink functions. After the system transfers the coordinates of the pupil center in the images to display coordinates, it determines the point at which the user gazed on the display, then passes that location to the game subroutine program. We used this eye-tracking system as a joystick to play a game with an application program in a multimedia environment. The experimental results verify the feasibility and validity of this eye-game system and its rehabilitation effects on the user's visual movement.
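
The coordinate-mapping step described above can be sketched as follows, assuming a simple affine model fitted by least squares to a few calibration fixations; the paper's actual calibration and energy factors are more involved.

 import numpy as np

 def fit_calibration(pupil_pts, screen_pts):
     """Fit an affine map from pupil-center image coordinates to display
     coordinates; pupil_pts and screen_pts are (n, 2) arrays, n >= 3."""
     pupil = np.asarray(pupil_pts, dtype=float)
     screen = np.asarray(screen_pts, dtype=float)
     design = np.hstack([pupil, np.ones((len(pupil), 1))])  # rows [x, y, 1]
     A, *_ = np.linalg.lstsq(design, screen, rcond=None)
     return A  # shape (3, 2)

 def pupil_to_screen(A, pupil_xy):
     x, y = pupil_xy
     return np.array([x, y, 1.0]) @ A

 # Example: a four-point corner calibration (all numbers invented).
 A = fit_calibration(pupil_pts=[(210, 160), (430, 158), (215, 300), (428, 305)],
                     screen_pts=[(0, 0), (1024, 0), (0, 768), (1024, 768)])
 print(pupil_to_screen(A, (320, 230)))  # lands roughly mid-screen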

Reference:
Lin, C.-S., Huan, C.-C., Chan, C.-N., Yeh, M.-S., & Chiu, C.-C. (2004). Design of a computer game using an eye-tracking device for eye's activity rehabilitation. Optics and Lasers in Engineering, 42(1), 91-108. Elsevier.
Link:
http://www.foylearts.net/jmagee/Bdes/des514m1/06brf514/Lin%20et%20al%202002-%20eye%20tracking.pdf

Eye Tracking as an Aiming Device in a Computer Game

Abstract:
This paper describes an experiment in the application of eye tracking to facilitate aiming in computer gaming. A simple 3D computer game was run under varying conditions to test the effect of gaze-contingent gaming on a player’s performance, measured by accuracy in selecting targets within the game and completion time. The game was run using a traditional mouse and with the Tobii ET-1750 eye tracker as aiming devices in timed and un-timed trials. The results showed that subjects had better performance in completing their objectives when using the mouse instead of the eye-tracker as an aiming device. However, difficulties with the calibration process suggest that the experiment may yield different results if run with a modified calibration process.

Reference:
Leyba, J., & Malcolm, J. (2004). Eye Tracking as an Aiming Device in a Computer Game. Course work (CPSC 412/612 Eye Tracking Methodology and Applications by A. Duchowski), Clemson University.
Link:
http://andrewd.ces.clemson.edu/courses/cpsc412/fall04/teams/reports/group2.pdf

Eye movements in an Action Game Tutorial

Abstract:
Action games are controversial and much discussed; at the same time they fascinate players all over the world. One way to find out what this attraction is about is to use eye tracking to explore them. This method can show explicit eye gaze direction within the game environment and at the same time point out what the mind determines as important in the different interactions in an action game tutorial. This study aims to lay out the foundations of players' eye behaviours in the light of training, learning and social behaviour, and to see whether there are any visual reinforcements in interactive media compared to a natural situation. Action games are today classified as entertainment products with built-in simulation paths, at the same time as some organisations bring in commercial games for professional training or evaluate their profit. A study made last year at Rochester University showed that non-video game players could improve their visual attention. In this study, eight subjects were playing and the recording tracked every eye movement and step in choice. The results revealed that facial interest is secondary in task progression, eye behaviour patterns are similar to eye behaviour in car driving, and re-fixations occurred after search and shooting, partly independent of background.

Reference:
Sennersten, C. (2004). Eye movements in an Action Game Tutorial. Student Paper. Department of Cognitive Science. Lund University, Sweden.
Link: http://www.sol.lu.se/humlab/eyetracking/Studentpapers/CharlotteSennersten.pdf

3D First Person Shooting Game by Using Eye Gaze Tracking

Abstract:
In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game's character by using eye gaze detection from successive images captured by a USB camera attached beneath the HMD. The proposed method is composed of 3 parts. First, we detect the user's pupil center with a real-time image processing algorithm from the successive input images. In the second, calibration part, when the user gazes at the monitor plane, the geometric relationship between the gaze position on the monitor and the detected position of the pupil center is determined. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by a handicapped game player who cannot use his (or her) hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player and the view direction of the game character.

Reference:
Lee, E.-C., & Park, K.-R. (2005). The KIPS Transactions: Part B, 12(4), 465-472. Korea Information Processing Society, August 2005, ISSN 1598-284x.
Link: http://www.koreascience.or.kr/article/articleresultdetail.jsp?no=34663577&searchtype=JSB&listlen=18&listno=13

A preliminary investigation into eye gaze data in a first person shooter game

Abstract:
This paper describes a study carried out in which the eye gaze data of several users playing a simple First Person Shooter (FPS) game has been recorded. This work shows the design and implementation of a simple game and how the execution of the game can be synchronized with an eye tracking system. The motivation behind this work is to determine the existence of visual psycho-perceptual phenomena, which may be of some use in developing appropriate information limits for distributed interactive media compression algorithms. Only 2 of the 140 degrees of human vision have a high level of detail. It may be possible to determine the areas of the screen that a user is focusing on and render them in high detail, or pay particular attention to their contents so as to set appropriate dead reckoning limits. Our experiment shows that eye tracking may allow for improvements in rendering and new compression algorithms to be created for an online FPS game.
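
As a back-of-the-envelope illustration of how small that high-detail region is on screen, the sketch below converts the 2 degree foveal angle into a pixel radius; the monitor geometry and viewing distance are assumed values.

 import math

 def foveal_radius_px(view_dist_cm, screen_width_cm, screen_width_px,
                      foveal_deg=2.0):
     """Pixels covered by the ~2 degree high-acuity region around gaze."""
     px_per_cm = screen_width_px / screen_width_cm
     radius_cm = view_dist_cm * math.tan(math.radians(foveal_deg / 2.0))
     return radius_cm * px_per_cm

 # A 1280 px wide, 38 cm wide monitor viewed from 60 cm:
 print(round(foveal_radius_px(60, 38, 1280)))  # -> ~35 px radius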

Reference:
Kenny, A., Koesling, H., Delaney, D., McLoone, S. and Ward, T. (2005) A preliminary investigation into eye gaze data in a first person shooter game, in Proceedings of the 19th European Conference on Modelling and Simulation (ECMS '05), Riga, Latvia, June 2005.
Link: http://eprints.nuim.ie/282/

Verification of an Experimental Platform Integrating a Tobii Eyetracking System with the HiFi Game Engine

Abstract:
Playing a commercial PC or console game is a highly visual activity, regardless of whether the purpose is entertainment or situated learning as discussed in the Serious Games field. If more information about the visual attention of the player can be recorded and easily analysed, important design information can be extracted. A range of different eyetracking equipment exists on the market and has been used in many studies over the years. However, very few studies describe dynamic stimuli involving the visual interaction of the user/player with a moving 3D object displayed on a computer screen. The reasons for this are that methods and software developed for eyetracking studies of static 2D stimuli are inappropriate for dynamic 3D stimuli, and manual analysis of dynamic 3D visual interaction is extremely time consuming. In order to address this, the authors have developed a software interface between the Tobii(TM) eyetracking system and the HiFi Game Engine for use in automated logging of dynamic 3D objects of gaze attention. This report describes the verification study performed to assess the performance of this integration between the eyetracker, logging tools and game engine. Detailed analysis shows effective results within the derived accuracy range, which is certainly sufficient for studies from a small scale to large scales necessary for extensive statistical analysis. The work presented in the report has been conducted in collaboration between FOI, Blekinge Institute of Technology and Gotland College.

Reference:
Sennersten, C., Alfredson, J., Castor, M., Hedström, J., Lindahl, B., Lindley, C., & Svensson, E. (2007). Verification of an Experimental Platform Integrating a Tobii Eyetracking System with the HiFi Game Engine. Command and Control Systems, Methodology Report, FOI-R--2227-SE, ISSN 1650-1942, FOI Swedish Defence Research Agency, February 2007.
Link: http://www2.foi.se/rapp/foir2227.pdf

Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games

Abstract:
The aim of this thesis is to investigate whether gaze-based interaction is a suitable means of input for problem solving games, where a player has to use his/her eyes not only to select objects, but also to visually perceive the puzzle and plan the next move in order to solve it. Two common problem solving puzzles were implemented: the Sudoku and the Tile Slide puzzle (or 15 puzzle). Each puzzle can be played with eye gaze or with the mouse. Although test subjects found gaze interesting, the mouse was still the preferred mode of interaction. We found that gaze selection is more erroneous than mouse selection and that these errors can cause a player to lose concentration on the task at hand. We also found that the user interface and the interaction sequence influence both the planning strategy that the player uses and the amount of time it takes him/her to complete the task.

Reference:
Gowases, T. (2007). Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games. Master's Thesis, May 2, 2007. Department of Computer Science, University of Joensuu, Finland.
Link: ftp://cs.joensuu.fi/pub/Theses/2007_MSc_Gowases_Tersia.pdf

Eye gaze assistance for a Game-like interactive task

Abstract:
Human beings communicate in abbreviated ways dependent on prior interactions and shared knowledge. Furthermore, humans share information about intentions and future actions using eye gaze. Among primates, humans are unique in the whiteness of the sclera and amount of sclera shown, essential for communication via interpretation of eye gaze. This paper extends our previous work in a Game-like interactive task by the use of computerised recognition of eye gaze and fuzzy signature based interpretation of possible intentions. This extends our notion of robot instinctive behaviour to intentional behaviour. We show a good improvement of speed of response in a simple use of eye gaze information. We also show a significant and more sophisticated use of the eye gaze information, which eliminates the need for control actions on the user’s part. We also make a suggestion as to returning visibility of control to the user in these cases.

Reference:
Gedeon, T., Zhu, D., & Mendis, S. (2008). Eye gaze assistance for a Game-like interactive task. International Journal of Computer Games Technology.
Link: http://www.hindawi.com/journals/ijcgt/aip.623725.html

Invisible eni: using gaze and pupil size to control a game

Abstract:
We present an eyes-only computer game, Invisible Eni, which uses gaze, blinking and, as a novelty, pupil size to affect game state. Pupil size can be indirectly controlled by physical activation, strong emotional experiences and cognitive effort. Invisible Eni maps the pupil size variations to the game mechanics and allows players to control game objects by use of willpower. We present the design rationale behind the interaction in Invisible Eni and consider the design implications of using pupil measurements in the interface. We discuss limitations of pupil-based interaction and provide suggestions for using pupil size as an active input modality.

Reference:
Ekman, I. M., Poikola, A. W., and Mäkäräinen, M. K. (2008) Invisible eni: using gaze and pupil size to control a game. In CHI '08 Extended Abstracts on Human Factors in Computing Systems, CHI '08. ACM, New York, NY, 3135-3140.
Link: http://doi.acm.org/10.1145/1358628.1358820

Voluntary pupil size change as control in eyes only interaction

Abstract:
We investigate consciously controlled pupil size as an input modality. Pupil size is affected by various processes, e.g., physical activation, strong emotional experiences and cognitive effort. Our hypothesis is that given continuous feedback, users can learn to control pupil size via physical and psychological self-regulation. We test it by measuring the magnitude of self-evoked pupil size changes following seven different instructions, while providing real time graphical feedback on pupil size. Results show that some types of voluntary effort affect pupil size on a statistically significant level. A second controlled experiment confirms that subjects can produce pupil dilation and constriction on demand during paced tasks. Applications and limitations of using voluntary pupil size manipulation as an input modality are discussed.
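
A minimal sketch of how a raw pupil-diameter stream could be turned into an up/down control signal, assuming a running-baseline comparison; the window lengths and the 8% threshold are illustrative, and a real system would also have to filter blinks and lighting changes.

 from collections import deque

 class PupilControl:
     """Smooth the pupil signal, compare it to a running baseline, and
     report 'dilate'/'constrict' when it departs far enough."""

     def __init__(self, baseline_len=300, smooth_len=15, threshold=0.08):
         self.baseline = deque(maxlen=baseline_len)  # ~ last few seconds
         self.recent = deque(maxlen=smooth_len)      # short smoothing window
         self.threshold = threshold  # relative change treated as intentional

     def update(self, diameter_mm):
         self.baseline.append(diameter_mm)
         self.recent.append(diameter_mm)
         base = sum(self.baseline) / len(self.baseline)
         now = sum(self.recent) / len(self.recent)
         rel = (now - base) / base
         if rel > self.threshold:
             return "dilate"
         if rel < -self.threshold:
             return "constrict"
         return None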

Reference:
Ekman, I., Poikola, A., Mäkäräinen, M., Takala, T., and Hämäläinen, P. (2008) Voluntary pupil size change as control in eyes only interaction. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications - ETRA '08. ACM, New York, NY, 115-118.
Link: http://doi.acm.org/10.1145/1344471.1344501

Snap Clutch, a Moded Approach to Solving the Midas Touch Problem

Abstract:
This paper proposes a simple approach to an old problem, that of the 'Midas Touch'. It uses modes to emulate different types of mouse behavior with gaze, and gestures to switch between these modes. A lightweight gesture is also used to switch gaze control off when it is not needed, thereby removing a major cause of the problem. The ideas have been trialled in Second Life, which is characterized by a feature-rich set of interaction techniques and a 3D graphical world. The use of gaze with this type of virtual community is of great relevance to severely disabled people, as it can enable them to be in the community on a similar basis to able-bodied participants. The assumption here, though, is that this group will use gaze as a single modality and that dwell will be an important selection technique. The Midas Touch Problem needs to be considered in the context of fast dwell-based interaction. The solution proposed here, Snap Clutch, is incorporated into the mouse emulator software. The user trials reported here show this to be a very promising way of dealing with some of the interaction problems that users of these complex interfaces face when using gaze by dwell.
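
The moded idea can be sketched as a small state machine in which a glance off one of the four screen edges selects the corresponding emulation mode; the mode names and the edge-to-mode mapping below are illustrative assumptions, not the published Snap Clutch gesture set.

 MODES = {"up": "dwell_click", "down": "gaze_off",
          "left": "drag", "right": "look_then_key"}

 class SnapClutch:
     def __init__(self, width, height, margin=50):
         self.w, self.h, self.m = width, height, margin
         self.mode = "gaze_off"  # start with gaze control disengaged

     def off_screen_edge(self, x, y):
         """Which edge, if any, the gaze sample lies beyond."""
         if y < -self.m: return "up"
         if y > self.h + self.m: return "down"
         if x < -self.m: return "left"
         if x > self.w + self.m: return "right"
         return None

     def update(self, x, y):
         edge = self.off_screen_edge(x, y)
         if edge:
             self.mode = MODES[edge]  # lightweight off-screen glance gesture
         return self.mode  # dwell/cursor handling then depends on the mode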

Reference:
Istance, H. O., Bates, R., Hyrskykari, A., & Vickers, S. (2008). Snap Clutch, a Moded Approach to Solving the Midas Touch Problem. Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ETRA '08, ACM Press, Savannah, March 2008.

See also the related article on "Eye-tracking interface means gamers' looks can kill" in New Scientist Tech, 5 May 2008.
http://technology.newscientist.com/article/dn13830-eyetracking-interface-means-gamers-looks-can-kill.html

Evaluation of Real-time Eye Gaze Logging by a 3D Game Engine

Abstract:
Human Computer Interaction studies of visual attention in dynamic 3D computer gameplay can be greatly facilitated by automated gaze object logging implemented by integration of eye gaze tracking systems with game engines. This verification study reports the spatial and temporal accuracy of such an integrated system.

Reference:
Charlotte Sennersten and Craig Lindley (2008) Evaluation of Real-time Eye Gaze Logging by a 3D Game Engine. In Proc. 12th IMEKO TC1 & TC7 Joint Symposium on Man Science & Measurement, Annecy, 2008.
Link: http://www.bth.se/fou/forskinfo.nsf/alfs/14fc0ac35cfea843c1257464003e9022

A Psychophysiological Logging System for a Digital Game Modification

Abstract:
This student thesis intends to facilitate cognitive experiments for gameplay experience studies. To achieve this, a psychophysiological logging framework was developed, which automatically reports the occurrence of specific game events to a log file and to the parallel port. Via the parallel port, communication with psychophysiological systems is possible.

Thus, psychophysiological data can be correlated with in-game data in real time. In addition, this framework is able to log viewed game objects via an eye tracker integration. This gives some information on how certain game elements affect the player's attention. For the development of this system the Source SDK, the game engine of Half-Life 2, has been used. Consequently, custom-built Half-Life 2 levels had to be developed, which are suitable for cognitive experiments. In this context, tools for level editing will be introduced.

This thesis shapes the basis for further research work in the area of psychophysiological software development and is intended to facilitate this for future scholars facing these issues.
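
The core logging idea can be sketched in a few lines: timestamp each game event, append it to a log file, and mirror a marker code to an external port so that psychophysiological recordings can be synchronised. The function names are illustrative and the port write is a stub; real parallel-port output is platform-specific, and the thesis's framework is built into the Source SDK rather than standalone Python.

 import time

 def make_logger(path, send_marker=lambda code: None):
     """Return a log_event(name, code) function writing to `path`."""
     log = open(path, "a")
     def log_event(name, code):
         log.write("%.6f\t%s\n" % (time.monotonic(), name))
         log.flush()           # keep the file current for live analysis
         send_marker(code)     # e.g. write `code` to the parallel port
     return log_event

 log_event = make_logger("session.log")
 log_event("enemy_spawned", 0x01)
 log_event("player_fired", 0x02)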

Reference:
Stellmach, S. (2007). A Psychophysiological Logging System for a Digital Game Modification. Technical Bachelor's Report.
Link: http://www.gamecareerguide.com/thesis/080527_stellmach.pdf

A Framework for Psychophysiological Data Acquisition in Digital Games

Abstract:
In order to rapidly develop digital games for psychophysiological experiments, a coherent and flexible development environment is required: something that allows researchers to design their experiments, build the stimulus game and easily integrate all required data acquisition functionality into it.

This thesis shows the design and implementation of such a framework. Methods for gathering player-related data are compared to establish a theoretical foundation for the framework. The logging framework is implemented as a set of Torque X components and an example game is developed in order to demonstrate the framework and the different logging components.

Reference:
Sasse, D. (2008). A Framework for Psychophysiological Data Acquisition in Digital Games. Master's Thesis.
Link: http://www.gamecareerguide.com/thesis/080520_sasse.pdf

Keeping an Eye on the Game: Eye Gaze Interaction with Massively Multiplayer Online Games and Virtual Communities for Motor Impaired Users

Abstract:
Online virtual communities are becoming increasingly popular both within the able-bodied and disabled user communities. These games assume the use of keyboard and mouse as standard input devices, which in some cases is not appropriate for users with a disability. This paper explores gaze-based interaction methods and highlights the problems associated with gaze control of online virtual worlds. The paper then presents a novel 'Snap Clutch' software tool that addresses these problems and enables gaze control. The tool is tested with an experiment showing that effective gaze control is possible although task times are longer. Errors caused by gaze control are identified and potential methods for reducing these are discussed. Finally, the paper demonstrates that gaze driven locomotion can potentially achieve parity with mouse and keyboard driven locomotion, and shows that gaze is a viable modality for game based locomotion both for able-bodied and disabled users alike.

Reference:
Vickers, S., Istance, H., Hyrskykari, A., Ali, N., & Bates, R. (2008). Keeping an Eye on the Game: Eye Gaze Interaction with Massively Multiplayer Online Games and Virtual Communities for Motor Impaired Users. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies, ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008.
Link: http://www.icdvrat.reading.ac.uk/2008/papers/ICDVRAT2008_S04_N05_Vickers_Istance_et_al.pdf
(according to the ICDVRAT2008 web page, the link will activate on 1 March 2009)

Gaze and voice based game interaction: the revenge of the killer penguins

Reference:
Wilcox, T., Evans, M., Pearce, C., Pollard, N., and Sundstedt, V. 2008. Gaze and voice based game interaction: the revenge of the killer penguins. In ACM SIGGRAPH 2008 Posters (Los Angeles, California, August 11 - 15, 2008). SIGGRAPH '08. ACM, New York, NY, 1-1.
Link: http://doi.acm.org/10.1145/1400885.1400972

Gaze vs. Mouse in Games: The Effects on User Experience

Abstract:
The possibilities of eye-tracking technologies in educational gaming are seemingly endless. The question we need to ask is what the effects of gaze-based interaction are on user experience and on strategy during learning and problem solving. In this paper we evaluate the effects of two gaze-based input techniques and of mouse-based interaction on user experience and immersion. In a between-subject study we found that although mouse interaction is the easiest and most natural way to interact during problem solving, gaze-based interaction brings more subjective immersion. The findings provide support for bringing gaze interaction methods into computer-based educational environments.

Reference:
Gowases, T., Bednarik, R., and Tukiainen, M. (2008) Gaze vs. Mouse in Games: The Effects on User Experience. In Proceedings of the International Conference on Computers in Education, ICCE 2008, pp. 773-777.
Link: http://www.apsce.net/icce2008/papers/ICCE2008-paper280.pdf
See also the video presentation of the paper at http://www.youtube.com/watch?v=HeHh3y3Z4Do

EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography

Abstract:
Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as music instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and light-weight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with equal performance to a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games which exploit unconscious eye movements and show which possibilities this new input modality may open up.
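
One way to picture the gesture recognition is as matching a sequence of saccade direction symbols against templates, as sketched below; the encoding, threshold and gesture set are assumptions for illustration, and the paper's recogniser works on real EOG signals with artefact compensation.

 # Gesture templates: sequences of saccade directions (assumed examples).
 GESTURES = {("L", "R"): "reload", ("U", "R", "D"): "change_weapon"}

 def saccade_directions(dh_samples, dv_samples, threshold=1.0):
     """Emit a direction symbol for each horizontal/vertical EOG delta
     that exceeds the saccade threshold."""
     dirs = []
     for dh, dv in zip(dh_samples, dv_samples):
         if abs(dh) >= abs(dv) and abs(dh) > threshold:
             dirs.append("R" if dh > 0 else "L")
         elif abs(dv) > threshold:
             dirs.append("D" if dv > 0 else "U")
     return tuple(dirs)

 def recognise(dh_samples, dv_samples):
     return GESTURES.get(saccade_directions(dh_samples, dv_samples))

 print(recognise([-2.0, 0.1, 2.2], [0.0, 0.2, 0.1]))  # -> reload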

Reference:
Bulling, A., Roggen, D., and Tröster, G. (2008) EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography. In the Second International Conference on Fun and Games, LNCS 5294. pp 33-45. Springer.
Link: http://dx.doi.org/10.1007/978-3-540-88322-7_4

Measuring and defining the experience of immersion in games

Abstract:
Despite the word's common usage by gamers and reviewers alike, it is still not clear what immersion means. This paper explores immersion further by investigating whether immersion can be defined quantitatively, describing three experiments in total. The first experiment investigated participants’ abilities to switch from an immersive to a non-immersive task. The second experiment investigated whether there were changes in participants’ eye movements during an immersive task. The third experiment investigated the effect of an externally imposed pace of interaction on immersion and affective measures (state anxiety, positive affect, negative affect). Overall the findings suggest that immersion can be measured subjectively (through questionnaires) as well as objectively (task completion time, eye movements). Furthermore, immersion is not only viewed as a positive experience: negative emotions and uneasiness (i.e. anxiety) also run high.

Reference:
Jennett, C., Cox, A.L., Cairns, P., Dhoparee, S., Epps, A., Tijs, T., and Walton, A. (2008) Measuring and defining the experience of immersion in games. Int. J. Hum.-Comput. Stud. 66(9), 641-661.
Link: http://dx.doi.org/10.1016/j.ijhcs.2008.04.004



Papers II: Tracking Gaze in Virtual Environments

Computational mechanisms for gaze direction in interactive visual environments

Abstract:
Next-generation immersive virtual environments and video games will require virtual agents with human-like visual attention and gaze behaviors. A critical step is to devise efficient visual processing heuristics to select locations that would attract human gaze in complex dynamic environments. One promising approach to designing such heuristics draws on ideas from computational neuroscience. We compared several such heuristics with eye movement recordings from five observers playing video games, and found that heuristics which detect outliers from the global distribution of visual features were better predictors of human gaze than were purely local heuristics. Heuristics sensitive to dynamic events performed best overall. Further, heuristic prediction power differed more between games than between different human observers. Our findings suggest simple neurally inspired algorithmic methods to predict where humans look while playing video games.
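
The 'global outlier' idea can be sketched very simply: score each location of a feature map by how far it departs from the frame-wide distribution. This single-feature z-score is a toy stand-in for the families of heuristics actually compared in the paper.

 import numpy as np

 def outlier_saliency(feature_map):
     """Z-score of each location against the global feature distribution."""
     m = np.asarray(feature_map, dtype=float)
     return np.abs(m - m.mean()) / (m.std() + 1e-9)

 # E.g. the intensity channel of one video frame; the max-z location is
 # the predicted gaze attractor.
 frame = np.random.rand(48, 64)
 frame[30, 40] = 5.0  # a bright, globally rare event
 sal = outlier_saliency(frame)
 print(np.unravel_index(sal.argmax(), sal.shape))  # -> (30, 40)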

Reference:
Peters, R. J., & Itti, L. (2006). Computational mechanisms for gaze direction in interactive visual environments. Proceedings of the ACM Eye Tracking Research and Applications (ETRA) Symposium, 2006. 20-27.
Link: http://ilab.usc.edu/publications/doc/Peters_Itti06etra.pdf

Towards eye based virtual environment interaction for users with high-level motor disabilities

Abstract:
An experiment is reported which extends earlier work on the enhancement of eye pointing in 2D environments through the addition of a zoom facility to its use in virtual 3D environments, using a similar enhancement. A comparison between hand pointing and eye pointing without any enhancement shows a performance advantage for hand based pointing. However, the addition of a 'fly' or 'zoom' enhancement increases both eye and hand based performance, and greatly reduces the difference between these devices. Initial attempts at 'intelligent' fly mechanisms and further enhancements are evaluated.
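
A minimal sketch of such a zoom enhancement: magnify the region around the gaze point so small targets become effectively larger, then map the selection back. The zoom factor and function names are assumptions for illustration.

 def zoom_rect(gaze, viewport, factor=3.0):
     """Source rectangle (x, y, w, h) to magnify to full viewport size,
     centred on the gaze point."""
     w, h = viewport[0] / factor, viewport[1] / factor
     return (gaze[0] - w / 2, gaze[1] - h / 2, w, h)

 def unzoom_point(pt, rect, viewport):
     """Map a selection made inside the zoomed view back to scene coords."""
     x, y, w, h = rect
     return (x + pt[0] * w / viewport[0], y + pt[1] * h / viewport[1])

 r = zoom_rect(gaze=(600, 400), viewport=(1280, 1024))
 print(unzoom_point((640, 512), r, (1280, 1024)))  # -> (600.0, 400.0)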

Reference:
Bates, R., & Istance, H. O. (2005). Towards eye based virtual environment interaction for users with high-level motor disabilities. Special Issue of International Journal of Disability & Human Development: The International Conference Series on Disability, Virtual Reality and Associated Technologies, Vol. 4(3).
Link: http://www.icdvrat.rdg.ac.uk/2004/papers/S09_N2_Bates_Istance_ICDVRAT2004.pdf

Gaze- vs. Hand-Based Pointing in Virtual Environment

Abstract:
This paper contributes to the nascent body of literature on pointing performance in Virtual Environments (VEs), comparing gaze- and hand-based pointing. Contrary to previous findings, preliminary results indicate that gaze-based pointing is slower than hand-based pointing for distant objects.

Reference:
Cournia, N., Smith, J.D., & Duchowski, A.T. (2003). Gaze- vs. Hand-Based Pointing in Virtual Environment, in Proc. SIGCHI 2003 (Short Talks & Interactive Posters), April 5-10, 2003, Ft. Lauderdale, FL.
Link: http://andrewd.ces.clemson.edu/research/vislab/docs/chi03-short.pdf

Evaluating gaze-contingent level of detail rendering of virtual environments using visual search

Abstract:
Level of detail rendering reduces the geometric complexity of objects in virtual reality in order to reduce the computational load on the rendering system. Although the resultant increase in rendering speed is desirable, the behavioral consequences of these techniques for humans performing realistic tasks in complex virtual environments are not well understood. The current study examines the behavior of human observers in virtual environments rendered using a gaze-contingent level of detail criterion. This method takes advantage of the fact that the visual sensitivity of the human visual system is greater at the point of gaze than in the periphery by rendering objects in the periphery with less detail than objects at the point of gaze. In the experiment, participants performed a "virtual search" task, i.e. a visual search task where participants are required to pan the viewport to find a target object among distractors in a virtual environment. Gaze-contingent rendering was employed where the level of detail dropped continuously from the point of gaze. The time to detect and localize the target was measured as a function of the rate of decline in visual detail. Frame rates were allowed to increase with decreasing detail, thus keeping computational load approximately constant. Reaction times to detect the target increased with decreasing detail while reaction times to localize the target decreased with decreasing detail. These results suggest that reduced detail impedes target identification while the increased frame rates due to the reduction in detail facilitates interaction with virtual environments. Overall, these results indicate that the behavioural performance costs of gaze-contingent level of detail techniques can be offset by the behavioural performance gains due to increased rendering speed.
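
The gaze-contingent criterion reduces to choosing a level of detail from angular eccentricity, as in the sketch below; the 5 degrees-per-level falloff and the pixels-per-degree figure are illustrative assumptions.

 import math

 def lod_level(obj_px, gaze_px, px_per_degree, degrees_per_level=5.0,
               max_level=4):
     """Level 0 = full detail at the gaze point; coarser meshes are used
     further into the periphery."""
     ecc_px = math.hypot(obj_px[0] - gaze_px[0], obj_px[1] - gaze_px[1])
     ecc_deg = ecc_px / px_per_degree
     return min(max_level, int(ecc_deg / degrees_per_level))

 print(lod_level((900, 500), (640, 400), px_per_degree=35))  # -> 1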

Reference:
Parkhurst, D., Law, I., & Niebur, E. (2001). Evaluating gaze-contingent level of detail rendering of virtual environments using visual search. In Lab Technical Report 2001-02, 1-6.
Link: http://cnslab.mb.jhu.edu/pubs/Parkhurst_etal01c.pdf

Interacting with Eye Movements in Virtual Environments

Abstract:
Eye movement-based interaction offers the potential of easy, natural, and fast ways of interacting in virtual environments. However, there is little empirical evidence about the advantages or disadvantages of this approach. We developed a new interaction technique for eye movement interaction in a virtual environment and compared it to more conventional 3-D pointing. We conducted an experiment to compare performance of the two interaction types and to assess their impacts on spatial memory of subjects and to explore subjects' satisfaction with the two types of interactions. We found that the eye movement based interaction was faster than pointing, especially for distant objects. However, subjects' ability to recall spatial information was weaker in the eye condition than the pointing one. Subjects reported equal satisfaction with both types of interactions, despite the technology limitations of current eye tracking equipment.

Reference:
Tanriverdi, V., & Jacob, R. J. K. (2000). Interacting with Eye Movements in Virtual Environments. In CHI '00 Proceedings, ACM, 265-272.
Link: http://www.cs.tufts.edu/~jacob/papers/chi00.tanriverdi.pdf

Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users

Reference:
Bates, R., Istance, H. O., & Vickers, S. (2008). Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users. In Proceedings of the 4th Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT), University of Cambridge, 13th-16th April 2008.
Link: http://www.cse.dmu.ac.uk/~svickers/pdf/CWUAAT%202008.pdf

User Performance of Gaze-based Interaction with On-line Virtual Communities

Abstract:
We present the results of an investigation into gaze-based interaction techniques with on-line virtual communities. The purpose of this study was to gain a better understanding of user performance with a gaze interaction technique developed for interacting with 3D graphical on-line communities and games. The study involved 12 participants, each of whom carried out 2 equivalent sets of 3 tasks in a world created in Second Life. One set was carried out using a keystroke and mouse emulator driven by gaze, and the other set was carried out with the normal keyboard and mouse. The study demonstrates that subjects were easily able to perform a set of tasks with eye gaze with only a minimal amount of training. It has also identified the causes of user errors and the amount of performance improvement that could be expected if the causes of these errors can be designed out.

Reference:
Istance, H., Hyrskykari, A., Vickers, S., and Ali, N. (2008) User Performance of Gaze-based Interaction with On-line Virtual Communities. In Proceedings of the 4th Conference on Communication by Gaze Interaction; COGAIN 2008, Prague, CZ, 2nd-3rd September, pp. 28-32.
Link: http://www.cogain.org/cogain2008/COGAIN2008-Proceedings.pdf

Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World

Abstract:
Online virtual worlds such as Second Life and World of Warcraft offer users the chance to participate in potentially limitless virtual worlds, all via a standard desktop PC, mouse and keyboard. This paper addresses some of the interaction barriers and privacy concerns that people with disabilities may encounter when using these worlds, and introduces an avatar Turing test that should be passed for worlds to be accessible for all users. The paper then focuses on the needs of high-level motor disabled users who may use gaze control as an input modality for computer interaction. A taxonomy and survey of interaction are introduced, and an experiment in gaze based interaction is conducted within these virtual worlds. The results of the survey highlight the barriers where people with disabilities cannot interact as efficiently as able-bodied users. Finally, the paper discusses methods for enabling gaze based interaction for high-level motor disabled users and calls for game designers to consider disabled users when designing game interfaces.

Reference:
Vickers, S., Bates, R., & Istance, H. (2008). Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies, ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008.
Link: http://www.icdvrat.reading.ac.uk/2008/papers/ICDVRAT2008_S04_N04_Vickers_Bates_et_al.pdf
(according to the ICDVRAT2008 web page, the link will activate on 1 March 2009)



Relevant Websites

COGAIN - Leisure Applications

Gaze-controlled games and leisure applications available via the COGAIN web portal
Link: http://www.cogain.org/downloads/leisure-applications

SpecialEffect GameBase

The SpecialEffect GameBase provides links to accessible computer games. A group of youngsters, all with disabilities, reviewed these games and helped in the creation of the website. It is hoped that the information on this site will help other young players to find games that are suitable for their interests and abilities. Each game review provides information about how a game is controlled, how fast it is, how much it is likely to cost etc., which saves our players from having to spend time/money on games that would not be suitable for them in the first place. Our 'Comments' section provides a format where players can keep adding tips and tricks for each other.
Link: http://www.specialeffect.org.uk/pages/gamebase.htm

Game Accessibility

All about game accessibility. For ALL disabled and interested gamers. Contains contributions to gaze controlled computer games.
See especially:
Resources (papers, videos etc.): http://www.game-accessibility.com/index.php?pagefile=papers
Forum: http://www.game-accessibility.com/forum/index.php

OneSwitch.Org

A resource of fun ideas and 'assistive technology' aimed at people with moderate to severe learning or physical disabilities.
Link: http://www.oneswitch.org.uk/

Design Tips For: Eye Tracker Games
Link: http://switchgaming.blogspot.com/2008/08/design-tips-for-eye-tracker-games.html

Head, Mouth and Eye Controls
At oneswitch.org you can find a list detailing a number of different styles of head, mouth and eye operated controllers. Most of these are for PCs and Apple computers, but there are alternatives for games consoles too. Some of these devices are very expensive, so it is always worth trying to track down some way of trying them out before you buy.
See http://www.oneswitch.org.uk/1/AGS/AGS-head.htm.

Software Downloads
Oneswitch.org provides more than 70 one-switch games of different types (adventure, arcade classics, platformers, puzzle & skill games, race games, shoot-em-ups…) for free download. In all likelihood a large number of these are suitable for gaze control. Alongside this you will find articles, instructions and more at
http://www.oneswitch.org.uk/2library.htm
http://www.oneswitch.org.uk/4/games/0index.htm

levelgames.net

A website focusing on switch games designed to be widely accessible for players who have Muscular Dystrophy, Cerebral Palsy, Spinal Injury, Head Injury or other physical disabilities.
Link: http://www.levelgames.net/

Games designed for the MyTobii Eye Tracking System (by Oleg Špakov)

This page contains a list of applications developed to run specifically in the MyTobii environment. Each application registers itself on installation so that MyTobii recognizes it as a MyTobii Partner Application.
Link: http://www.cs.uta.fi/~oleg/mytobii.html

World of Warcraft Percept Interface (by Oleg Komogortsev)

Oleg Komogortsev created an interface that allows users to play computer games with gaze control, without the use of a mouse or keyboard. The interface was tested with the virtual reality game World of Warcraft.
Link: http://www.cs.kent.edu/~okomogor/wowpercept/wowpercept.htm

Adventure Game Studio

Adventure Game Studio (AGS for short) allows you to create your own point-and-click adventure games, similar to the early 90's Sierra and Lucasarts adventures. It consists of an easy-to-use development environment and a run-time engine. AGS is free. You need no programming experience to make a game using AGS - setting most game options is just a matter of point-and-click (though scripting is of course available if you prefer).
Link: http://www.adventuregamestudio.co.uk/

Entertainment Software designed for an EOG based Eye Tracking System: EagleEyes

This website contains various application software designed to run with EagleEyes, Camera Mouse and other similar systems. (Includes Games, Spell and Speak, System)
Link: http://www.bc.edu/schools/csom/eagleeyes/downloads.html

List of open source games

Open source games are computer games assembled using open-source software and open content. These games are open to modifications, such as implementing gaze control.
Link: http://en.wikipedia.org/wiki/List_of_open_source_games

Game Accessibility Suite

Code library and utilities to enhance accessibility to existing and future games.
Link: http://sourceforge.net/projects/gameaccess/

Retro Remakes Forum

Forum on game accessibility containing threads on eye and head control.
Link: http://www.retroremakes.com/forum2/forumdisplay.php?f=84

Eye Trackers

Catalogue of currently available eye trackers for interactive applications.
Link: http://www.cogain.org/wiki/Eye_Trackers

Gaze-aware Space Vampires (by Chris Schmelzle)

Chris Schmelzle trialled how adding eye movement information to the game's artificial intelligence can enhance the gaming experience: the game enemies know where the player is looking.
Link: http://www.cschmelzle.net/eye.html
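
One simple form such gaze-aware enemy AI can take is sketched below: an enemy that only advances while the player is not looking at it. Whether Space Vampires uses exactly this rule is not stated here; names and thresholds are illustrative.

 import math

 def update_enemy(enemy, player, gaze, watched_radius=80, speed=2.0):
     """Advance the enemy toward the player unless gaze is on the enemy."""
     ex, ey = enemy
     if math.hypot(gaze[0] - ex, gaze[1] - ey) < watched_radius:
         return enemy  # frozen while the player is looking at it
     dx, dy = player[0] - ex, player[1] - ey
     dist = math.hypot(dx, dy) or 1.0
     return (ex + speed * dx / dist, ey + speed * dy / dist)

 print(update_enemy((100, 100), (400, 400), gaze=(500, 80)))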



Multimedia

YouTube video clips on gaze & games

* Bejeweled 2 Advanced Player Eye-tracking Study (video & information): http://www.getacd.org/listen_YPbhSoj3ZFM/bejeweled_2_advanced_player_eye_tracking_study
* Dreamhack Eye-Tracking Experiment, by the BTH game research group: http://www.youtube.com/watch?v=6PZpsWzjnvE
* Eye Based Video Game Control: Quake 2: http://www.youtube.com/watch?v=3pRWYE2LRhk
* Eye Based Video Game Control: Neverwinter Nights: http://www.youtube.com/watch?v=IX6H83ZgYGE
* Eye Based Video Game Control: Missile Command: http://www.youtube.com/watch?v=3JkdFFxdlsw
* Eye Gaze Computer Game Crazy Taxi via Tobii PCEye: http://www.youtube.com/watch?v=qbotg30L0rc
* Eye Gaze Driven Second Life: Camera: http://www.youtube.com/watch?v=ClPAFITx9yY, Locomotion: http://www.youtube.com/watch?v=UFrvl-eFsAQ. See also http://www.cse.dmu.ac.uk/~svickers/scvideos.html
* Eye Gaze Interaction with World of Warcraft: http://www.youtube.com/watch?v=NBIjWA8CHls
* Gaze-Controlled Applications at University of Tampere: Board Games: http://www.youtube.com/watch?v=0Nz68kz51Os
* Gaze-Controlled First-Person-Shooter (Eye Tracking with IntelliGaze(tm) technology): http://www.youtube.com/watch?v=ldw3HugJ2rE
* House of the Dead with Eye Tracking: http://www.youtube.com/watch?v=lGehsY7pcrc
* Interaction using Gaze Direction (gaze for direction changes + key press): http://www.youtube.com/watch?v=OfnQDJW6xXA
* Playing Angry Birds with Gaze Control (IntelliGaze, Desktop 2.0): http://www.youtube.com/watch?v=3j2bEWGkDKE
* Playing Unreal Tournament with an eye-tracker: http://www.youtube.com/watch?v=QmvrR5z4NOA
* Solitaire with an eye-tracker: http://www.youtube.com/watch?v=XwMoAqgikRM
* Tobii EyeAsteroids eye-controlled arcade game: http://www.youtube.com/watch?v=i0xgyBNVrHk
* Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!: http://www.youtube.com/watch?v=tfxxoN_RbJ8

Organisations

SpecialEffect

SpecialEffect is a charitable organisation dedicated to helping ALL young people with disabilities to enjoy computer games. For these children, the majority of computer games are simply too quick or too difficult to play, and we can help them and their parents to find out which games they CAN play, and how to adapt those games that they can't.
Link: http://www.specialeffect.org.uk/

IGDA (International Game Developers Association) Game Accessibility Special Interest group (GA-SIG)

The GA-SIG was formed to help the game community strive towards creating mainstream games that are universally accessible to all, regardless of disability.
Link: http://www.igda.org/wiki/index.php/Game_Accessibility_SIG

Pin Interactive

Game Accessibility Development company
Link: http://www.pininteractive.com/


Please email any additions or corrections to office (at) cogain (dot) org

NOTE: This web resource is part of the COGAIN Deliverable D4.5 Online information resources on how to use gaze for the control of selected games by Michael Heubner (Technical University of Dresden), Fiona Mulvey (Technical University of Dresden) and Päivi Majaranta (University of Tampere). Thanks to: Faten Ahmed (Technical University of Dresden), Oleg Špakov (University of Tampere) and Barrie Ellis (Oneswitch.org.uk). The original version was prepared in August 2007 and delivered in October 2007. New material is added as it appears.

See also the more general Gaze Interaction Bibliography