Eyeing a real-time human-computer interface to assist those with motor disabilities (Article)

Sesin, A., Adjouadi, M., Ayala, M., et al. (2008). Eyeing a real-time human-computer interface to assist those with motor disabilities. IEEE Potentials, 27(3), 19-25. doi:10.1109/MPOT.2007.914733

cited authors

  • Sesin, A; Adjouadi, M; Ayala, M; Cabrerizo, M; Barreto, A

abstract

  • An adaptive real-time human-computer interface (HCI) has been developed as an assistive technology tool for persons with severe motor disabilities to harness the power of computers and access the variety of resources that are available to all. The novelty of the proposed HCI system is that it adapts to the different and potentially changing jitter characteristics of each specific user through the configuration and training of an artificial neural network (ANN). A number of HCIs have relied on the integration of eye-gaze tracking (EGT) systems as one way to provide user interaction with the computer through eye movement. The EGT-based HCI uses a remote eye-gaze setup, which, in contrast to the head-mounted version, is less intrusive and frees the user from any physical constraint. The EGT system in the HCI reads and sends eye-gaze position data to the stimulus computer, where the data are translated into display coordinates to guide the position of the mouse pointer.
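
As an editorial illustration of the pipeline the abstract describes (not code from the article), the Python sketch below maps normalized eye-gaze samples to display coordinates and damps jitter before moving the pointer. The calibration constants, screen size, and moving-average smoother are hypothetical stand-ins; the published system trains a per-user artificial neural network for the jitter-reduction stage.

    # Minimal sketch, not the authors' implementation: translate gaze samples
    # from an eye-gaze tracker (EGT) into display coordinates and damp jitter
    # before positioning the mouse pointer.
    from collections import deque

    GAZE_RANGE = (0.0, 1.0)      # assumed normalized tracker output
    SCREEN_SIZE = (1280, 1024)   # assumed display resolution

    def gaze_to_display(gx, gy):
        """Translate one normalized gaze sample into pixel coordinates."""
        lo, hi = GAZE_RANGE
        sx = (gx - lo) / (hi - lo) * (SCREEN_SIZE[0] - 1)
        sy = (gy - lo) / (hi - lo) * (SCREEN_SIZE[1] - 1)
        return int(round(sx)), int(round(sy))

    class JitterSmoother:
        """Stand-in for the adaptive per-user stage (the paper trains an ANN);
        a plain moving average is used here only to illustrate jitter damping."""
        def __init__(self, window=5):
            self.samples = deque(maxlen=window)

        def update(self, point):
            self.samples.append(point)
            n = len(self.samples)
            return (sum(p[0] for p in self.samples) // n,
                    sum(p[1] for p in self.samples) // n)

    if __name__ == "__main__":
        smoother = JitterSmoother()
        # Simulated noisy gaze samples around a single fixation point.
        for gx, gy in [(0.51, 0.49), (0.53, 0.50), (0.49, 0.52), (0.50, 0.48)]:
            print("pointer at", smoother.update(gaze_to_display(gx, gy)))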

publication date

  • January 1, 2008

Digital Object Identifier (DOI)

  • 10.1109/MPOT.2007.914733

start page

  • 19

end page

  • 25

volume

  • 27

issue

  • 3