Low Vision Stroke-Gesture Dataset

This dataset contains 6,562 stroke-gesture samples collected from 54 participants, 27 of whom have low vision.
The dataset is a companion to our IEEE Pervasive Computing 2018 paper, Vatavu, Gheran, and Schipor (2018), in which we report experimental results on stroke-gestures performed by people with low vision and by people without visual impairments. For example, we found that some aspects of gesture articulation, such as gesture length or size, are minimally affected by the impairment, while other aspects, such as production times, reveal suboptimal visuomotor routines of eye-hand coordination. For the complete set of results, please see our paper.
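To make these articulation measures concrete, the following C# sketch shows how two of them, gesture path length and production time, can be computed from a sequence of timestamped touch points. This is a minimal sketch under assumed data structures: the GesturePoint record and the LoadGesture loader mentioned in the usage comment are hypothetical illustrations, not the dataset's actual schema or the API of our released code.

```csharp
using System;
using System.Collections.Generic;

// A minimal sketch, assuming each gesture is available as an ordered list
// of timestamped touch points. The GesturePoint record and its field names
// are illustrative assumptions, not the dataset's actual schema.
public record GesturePoint(double X, double Y, long TimestampMs);

public static class GestureMeasures
{
    // Gesture path length: the sum of Euclidean distances between
    // consecutive points along the stroke.
    public static double PathLength(IReadOnlyList<GesturePoint> points)
    {
        double length = 0.0;
        for (int i = 1; i < points.Count; i++)
        {
            double dx = points[i].X - points[i - 1].X;
            double dy = points[i].Y - points[i - 1].Y;
            length += Math.Sqrt(dx * dx + dy * dy);
        }
        return length;
    }

    // Production time: the elapsed time between the first and
    // last touch sample of the gesture.
    public static long ProductionTimeMs(IReadOnlyList<GesturePoint> points) =>
        points[^1].TimestampMs - points[0].TimestampMs;
}

// Example usage (LoadGesture is a hypothetical loader for the dataset files):
// List<GesturePoint> gesture = LoadGesture("sample.xml");
// Console.WriteLine($"length = {GestureMeasures.PathLength(gesture):F1} px");
// Console.WriteLine($"time = {GestureMeasures.ProductionTimeMs(gesture)} ms");
```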

Resources

We release our gesture dataset to encourage further research and development toward understanding stroke-gesture input on touchscreen devices for users with visual impairments. Our dataset and C# code are freely available to download and use for research purposes under this license.
If you find our dataset and/or code useful for your work, please let us know.
If you use our dataset and/or code to report results in scientific publications, please reference the Vatavu et al. (2018) publication that introduced these resources. Thanks!

Publication

Vatavu, Gheran, and Schipor (2018). IEEE Pervasive Computing.
Contact

For questions or suggestions regarding this web page, the dataset, or the source code, please contact Prof. Radu-Daniel Vatavu.