An adaptive real-time human-computer interface (HCI) has been developed as an assistive technology tool that enables persons with severe motor disabilities to harness the power of computers and access the variety of resources available to all. The novelty of the proposed HCI system is that it adapts to the different, and potentially changing, jitter characteristics of each specific user through the configuration and training of an artificial neural network (ANN). A number of HCIs have relied on the integration of eye-gaze tracking (EGT) systems as one means of providing user interaction with the computer through eye movement. The EGT-based HCI is built on a remote eye-gaze setup, which is less intrusive than the head-mounted version and frees the user from any physical constraint. The EGT system in the HCI reads eye-gaze position data and sends it to the stimulus computer, where the data is translated into display coordinates that guide the position of the mouse pointer.
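The final step above, translating eye-gaze position data into display coordinates for the pointer, can be sketched as follows. This is a minimal illustration under assumed conventions: the class name, calibration ranges, and screen resolution are hypothetical, and a simple moving-average smoother stands in for the user-adaptive ANN the system actually trains to handle jitter.

```python
from collections import deque

class GazeToScreenMapper:
    """Hypothetical sketch: map raw EGT coordinates to display coordinates."""

    def __init__(self, gaze_range, screen_size, window=5):
        # gaze_range: ((gx_min, gx_max), (gy_min, gy_max)) from a calibration pass
        # screen_size: (width, height) of the stimulus display, in pixels
        (self.gx_min, self.gx_max), (self.gy_min, self.gy_max) = gaze_range
        self.width, self.height = screen_size
        self.history = deque(maxlen=window)  # recent samples for smoothing

    def map(self, gx, gy):
        # Linear calibration: normalize gaze coordinates to [0, 1],
        # then scale to the display resolution.
        x = (gx - self.gx_min) / (self.gx_max - self.gx_min) * self.width
        y = (gy - self.gy_min) / (self.gy_max - self.gy_min) * self.height
        # Clamp to the screen, then average the last few samples to damp
        # eye-tremor jitter (the role played by the trained ANN in the
        # actual system).
        x = min(max(x, 0.0), self.width - 1)
        y = min(max(y, 0.0), self.height - 1)
        self.history.append((x, y))
        n = len(self.history)
        return (sum(p[0] for p in self.history) / n,
                sum(p[1] for p in self.history) / n)

mapper = GazeToScreenMapper(((0, 1000), (0, 1000)), (1920, 1080))
pointer = mapper.map(500, 500)  # center of gaze range maps near screen center
```

The pointer position returned here would then be handed to the operating system's cursor-control API; the adaptive ANN replaces the fixed-window smoother so the amount of filtering matches each user's measured jitter.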