Gesture Interfaces for Visually-Impairing Interaction Contexts

Project summary
Project no: PN-II-RU-TE-2014-4-1187; Contract no: 47/01.10.2015
Principal Investigator: Radu-Daniel Vatavu
Funded by UEFISCDI, Romania
Running period: October 2015 - September 2017 (24 months)

Abstract
In this project, we design gesture interface technology that is more accessible to people with visual impairments, as well as to people without impairments who occasionally find themselves in visually-impairing situations, i.e., when they cannot get a clear, direct look at the smartphone yet still need to interact with it. Our goal is to understand the effects of visual impairment on gesture performance on touch-sensitive mobile devices and, consequently, to inform the development of new algorithms that recognize gestures under various visually-impairing conditions and of new interaction techniques that help mobile users overcome their visual impairment, whether physiological or situational.

Touch input on today's smart devices is entirely visual and, consequently, requires a direct view of the screen to locate, select, and control on-screen objects. Eye conditions that affect the clarity of vision or the field of view make interaction less efficient and, possibly, ineffective.



From left to right: (1) clear sight of a tablet, as experienced by a person with full sight; (2) visual acuity loss, caused for example by hyperopia; (3) peripheral vision loss, caused by moderate glaucoma; (4) central vision loss, caused by moderate age-related macular degeneration; (5) color blindness. Note: images were produced using the Vision and Hearing Impairment Simulator and are included with permission, courtesy of Sam Waller.

Concrete objectives
  1. Understand the ways in which visually-impairing conditions, whether physiological or situational, affect touch input on mobile devices such as smartphones and tablets.
  2. Develop efficient and robust algorithms for recognizing touch gestures performed in visually-impairing contexts.
  3. Design and implement assistive touch input techniques for efficient item selection and text entry on mobile devices in visually-impairing contexts.

Expected results
  1. Recognition algorithms for touch and free-hand gestures performed in visually-impairing contexts.
  2. Interaction techniques for target selection and symbol entry on smart devices with appropriate feedback to assist touch input.
  3. Gesture datasets collected from people with low vision and from people without visual impairments in visually-impairing situations, together with software for gesture data collection in experimental settings and a methodology for gesture analysis.
  4. Scientific publications in high-quality journals and conferences (at least 5 major publications), research and progress reports.

Team
  1. Radu-Daniel Vatavu, Principal Investigator
  2. Doina-Maria Schipor, Postdoctoral researcher (Psychology)
  3. Ionela Rusu, Postdoctoral researcher (Computer Science)
  4. Gabriel Cramariuc, PhD student (2015-2017) & Postdoctoral researcher (2017) (Computer Science)
  5. Bogdan-Florin Gheran, PhD student (Computer Science)

Publications
  1. Radu-Daniel Vatavu, Bogdan-Florin Gheran, Maria Doina Schipor. (2018). The Impact of Low Vision on Touch Gesture Articulation on Mobile Devices. IEEE Pervasive Computing 17(1). IEEE, 27-37
    DOI | ISI WOS:000427706000005
    IF: 3.022 | 5-Year IF: 2.916 (JCR 2017)
  2. Radu-Daniel Vatavu. (2017). Visual Impairments and Mobile Touchscreen Interaction: State-of-the-Art, Causes of Visual Impairment, and Design Guidelines. International Journal of Human-Computer Interaction 33(6). Taylor & Francis, 486-509
    PDF | DOI | ISI WOS:000400857000006
    IF: 1.118 | 5-Year IF: 1.396 (JCR 2016)
  3. Maria Doina Schipor, Radu-Daniel Vatavu. (2017). Neurobiological and Neurocognitive Models of Vision for Touch Input on Mobile Devices. In Proceedings of EHB '17, the 6th IEEE International Conference on e-Health and Bioengineering (Sinaia, Romania, June 2017). IEEE Press, 353-356
    PDF | DOI | ISI WOS:000445457500089
  4. Maria Doina Schipor, Radu-Daniel Vatavu. (2017). Coping Strategies of People with Low Vision for Touch Input: A Lead-in Study. In Proceedings of EHB '17, the 6th IEEE International Conference on e-Health and Bioengineering (Sinaia, Romania, June 2017). IEEE Press, 357-360
    PDF | DOI | ISI WOS:000445457500090
  5. Luis A. Leiva, Daniel Martín-Albo, Radu-Daniel Vatavu. (2017). Synthesizing Stroke Gestures Across User Populations: A Case for Users with Visual Impairments. In Proceedings of CHI '17, the 35th ACM Conference on Human Factors in Computing Systems (Denver, Colorado, USA, May 2017). New York: ACM Press, 4182-4193
    ACCEPTANCE RATE: 25% | ARC CORE A*
    PDF | DOI | ISI WOS:000426970504006
  6. Radu-Daniel Vatavu. (2017). Improving Gesture Recognition Accuracy on Touch Screens for Users with Low Vision. In Proceedings of CHI '17, the 35th ACM Conference on Human Factors in Computing Systems (Denver, Colorado, USA, May 2017). New York: ACM Press, 4667-4679
    ACCEPTANCE RATE: 25% | ARC CORE A*
    PDF | DOI | ISI WOS:000426970504048
  7. Yosra Rekik, Radu-Daniel Vatavu, Laurent Grisoni. (2017). Spontaneous Gesture Production Patterns on Multi-touch Interactive Surfaces. In C. Anslow et al. (Eds.), Collaboration Meets Interactive Spaces. Springer International Publishing Switzerland, 33-46
    DOI
  8. Radu-Daniel Vatavu, Annette Mossel, Christian Schönauer. (2016). Digital Vibrons: Understanding Users' Perceptions of Interacting with Invisible, Zero-Weight Matter. In Proceedings of MobileHCI '16, the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services (Florence, Italy, September 2016). New York: ACM Press, 217-226
    ACCEPTANCE RATE: 23.9% (57/238)
    PDF | DOI | YOUTUBE
  9. Bogdan-Florin Gheran. (2016). Design and Engineering of Software Applications for Touch Input Collection on Mobile Devices. In Proceedings of the 11th International Conference on Virtual Learning (Craiova, Romania, October 2016). 323-328
    PDF | ISI WOS:000444941400042
  10. Martez E. Mott, Radu-Daniel Vatavu, Shaun K. Kane, Jacob O. Wobbrock. (2016). Smart Touch: Improving Touch Accuracy for People with Motor Impairments with Template Matching. In Proceedings of CHI '16, the 34th ACM Conference on Human Factors in Computing Systems (San Jose, California, USA, May 2016). New York: ACM Press, 1934-1946
    ACCEPTANCE RATE: 23.2% (565/2435) | ARC CORE A*
    PDF | DOI | YOUTUBE | ISI WOS:000380532901090
    BEST PAPER AWARD
  11. Radu-Daniel Vatavu, Jacob O. Wobbrock. (2016). Between-Subjects Elicitation Studies: Formalization and Tool Support. In Proceedings of CHI '16, the 34th ACM Conference on Human Factors in Computing Systems (San Jose, California, USA, May 2016). New York: ACM Press, 3390-3402
    ACCEPTANCE RATE: 23.2% (565/2435) | ARC CORE A*
    PDF | DOI | ISI WOS:000380532903037
  12. Radu-Daniel Vatavu. (2016). Tools for Designing for Home Entertainment: Gesture Interfaces, Augmented Reality, and Smart Spaces. In Proceedings of CHI EA '16, the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. New York: ACM Press, 1003-1006
    PDF | DOI
  13. Bogdan-Florin Gheran, Gabriel Cramariuc, Ionela Rusu, Elena-Gina Craciun. (2016). Tools for Collecting Users' Touch and Free-Hand Gestures on Mobile Devices in Experimental Settings. In Proceedings of the 13th International Conference on Development and Application Systems (May 2016). IEEE Press, 248-253
    PDF | DOI | ISI WOS:000383222200040
  14. Dorin-Mircea Popovici, Radu-Daniel Vatavu, Mihai Polceanu. (2015). GRASPhere: A Prototype to Augment Indirect Touch with Grasping Gestures. In Proceedings of MUM '15, the 14th International Conference on Mobile and Ubiquitous Multimedia. New York: ACM Press, 350-354
    PDF | DOI

Project reports (in Romanian)
  1. Progress reports: 2015 PDF, 2016 PDF, and 2017 PDF
  2. Scientific reports: 2015 PDF and 2016 PDF
  3. Summary scientific report (2015-2017): PDF

Other resources (software, algorithms, and datasets)
  1. Software for gesture data collection in experimental settings
    RAR | License
  2. Touch gesture recognition algorithm (pseudocode; see the illustrative sketch after this list)
    PDF
  3. Free-hand gesture recognition algorithm (pseudocode)
    PDF
  4. Gesture datasets. The following datasets were collected:
    • 26,625 touch tap gestures for static targets from 54 participants (27 with visual impairments)
    • 24,343 touch tap gestures for moving targets from 54 participants (27 with visual impairments)
    • 11,334 touch tap gestures for dense targets from 54 participants (27 with visual impairments)
    • 6,562 stroke gestures from 54 participants (27 with visual impairments)
    • 3,600 free-hand gestures from 30 participants (10 with visual impairments)
    • 11,383 touch tap gestures from 11 participants and 2,400 stroke gestures from 10 participants in situationally-impairing conditions
    Demographic description of participants with visual impairments PDF
    We will make these datasets public once the corresponding publications are accepted and published.
  5. Demos for the 1-2-Text and 1-2-3-Text soft keyboards are available here and here. Please note that the two keyboards work with touch input only.
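
As an illustration of how template-based gesture recognition of the kind referenced in items 2 and 3 typically works, the sketch below implements a minimal nearest-neighbor point-cloud matcher in Python, loosely in the spirit of the $P family of recognizers. This is only a sketch under stated assumptions, not the project's released pseudocode: the resampling size (N = 32), the simplified greedy point matching, and all function names are illustrative choices. The same idea extends from touch strokes to free-hand gestures by matching 3-D rather than 2-D point clouds.

    import math

    N = 32  # number of points per gesture after resampling (illustrative choice)

    def path_length(points):
        """Total length of the polyline through the gesture's points."""
        return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

    def resample(points, n=N):
        """Resample a stroke to n points spaced equally along its path."""
        interval = path_length(points) / (n - 1)
        pts = list(points)
        resampled = [pts[0]]
        d, i = 0.0, 1
        while i < len(pts):
            seg = math.dist(pts[i - 1], pts[i])
            if seg > 0 and d + seg >= interval:
                t = (interval - d) / seg
                q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                     pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
                resampled.append(q)
                pts.insert(i, q)  # q starts the next segment
                d = 0.0
            else:
                d += seg
            i += 1
        while len(resampled) < n:  # guard against floating-point shortfall
            resampled.append(pts[-1])
        return resampled[:n]

    def normalize(points):
        """Scale to a unit box and move the centroid to the origin."""
        xs, ys = zip(*points)
        scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
        cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
        return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

    def cloud_distance(a, b):
        """Greedy one-to-one matching cost between two equal-size point clouds."""
        unmatched = list(range(len(b)))
        cost = 0.0
        for i, p in enumerate(a):
            j = min(unmatched, key=lambda k: math.dist(p, b[k]))
            unmatched.remove(j)
            cost += (1 - i / len(a)) * math.dist(p, b[j])  # early matches weigh more
        return cost

    def recognize(candidate, templates):
        """Return the label of the stored template nearest to the candidate."""
        c = normalize(resample(candidate))
        best_label, best_cost = None, float("inf")
        for label, points in templates:
            t = normalize(resample(points))
            d = min(cloud_distance(c, t), cloud_distance(t, c))
            if d < best_cost:
                best_label, best_cost = label, d
        return best_label

For example, recognize(stroke, [("circle", circle_points), ("x", x_points)]) returns the label of the nearest stored template. The robustness to low-vision articulation patterns studied in this project would come from the collected template sets and the matching refinements described in the publications above, not from this simplified matcher.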