Eye-hand coordination on mobile devices

Project Leader

Xinhui Jiang, Ph.D. Student, Center for Human-Engaged Computing

Touchscreen mobile devices are widely used but face challenges such as the mismatch between target size and fingertip size (the "fat finger" problem) and the lack of haptic feedback. Current solutions focus on modeling touch behavior and overlook the process of carrying out the task itself. To deepen the understanding of user behavior on touchscreens and to build a prediction system that leverages eye movement data, we propose research on touchscreen mobile devices based on observation of finger and eye movement. The properties of the device and their influence on performance are fully considered. The results of this project can serve not only as a reference for user interface design on mobile devices, but also contribute to modeling human eye-hand movement in target-directed tasks.

Keywords

Eye movement, motion tracking, mobile device, eye-hand coordination.