Identifying facial gestures to emulate a mouse: a navigation application on Facebook.
A system is presented for emulating a mouse through head movement and eyelid closure. The position of the head controls the horizontal and vertical displacement of the cursor, while closing the eyelids activates the right and left mouse buttons. The system includes zoom, navigation shortcuts, vertical scrollbars, and menu activation according to cursor position, features that enhance functionality and facilitate handling applications. The proposed solution eliminates the need for direct contact with a mouse and empowers people with motor disabilities of the upper extremities to interact with the computer. The mouse emulator can also be used by people without limitations to extend their set of command instructions. The system was tested while navigating the social network Facebook, where an average speed of 382 pixels/s was obtained, with an average accuracy of 22 pixels on the X axis and 17 pixels on the Y axis. After users gained experience with the interface, improvements of 23% in execution time and 37% in location accuracy were observed. Click activation by dwelling over a menu option achieved a success rate of 100%, while right and left clicks by eyelid closure achieved 93% and 92%, respectively. Finally, surveys showed high user satisfaction with the proposed interface during interaction with Facebook.
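The control mapping described above (head displacement drives the cursor; a sustained single-eye closure fires a click) can be illustrated with a minimal sketch. The function names, the sensitivity gain, and the frame-count threshold below are illustrative assumptions, not values taken from the paper:

```python
# Hypothetical sketch of the head/eyelid control mapping described in the
# abstract. Thresholds and names are assumptions for illustration only.

def cursor_step(head_dx, head_dy, gain=4.0):
    """Map head displacement (e.g. face-center offset between frames,
    in pixels) to a cursor displacement, scaled by a sensitivity gain."""
    return gain * head_dx, gain * head_dy

def click_event(left_eye_open, right_eye_open, closed_frames, min_frames=5):
    """Fire a click when exactly one eye stays closed for at least
    `min_frames` consecutive frames (a deliberate wink).
    Both eyes closed is treated as an involuntary blink and ignored."""
    if left_eye_open == right_eye_open:
        return None  # both open or both closed: no click
    if closed_frames < min_frames:
        return None  # closure too brief to be intentional
    return "left_click" if not left_eye_open else "right_click"
```

A real implementation would feed these functions per-frame head-pose and eyelid-state estimates from a face tracker; the duration threshold is what separates deliberate clicks from spontaneous blinks.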