MotorSkill: Effective Gesture Interactions with Touch Surfaces for Motor Impairment Conditions

Project summary
Project no.: PN-III-P2-2.1-PED-2016-0688; Contract no.: 209PED/2017
Principal Investigator: Radu-Daniel Vatavu
Funded by UEFISCDI, Romania
Funding scheme: PN-III P2 - Increasing economic competitiveness through research, development and innovation (PED)
Running period: August 2017 - December 2018 (17 months)

Abstract

People with upper body motor impairments encounter many challenges when using one of today's dominant forms of interaction with modern computing technology: touch and gesture input. Unlike people without motor impairments, they need to adopt workaround strategies to access content on touchscreen devices. In this project, we design and develop new technology that assists people with motor impairments in using gesture input on touchscreen devices effectively by employing eye gaze tracking, voice input, and electroencephalography (EEG) analysis.

Concrete objectives

  1. Understand how people with motor impairments perform touch gestures and how eye gaze, speech, and EEG data coordinate with stroke-gesture input.
  2. Develop an efficient touch stroke-gesture recognition algorithm for users with motor impairments assisted by eye gaze, speech, and electroencephalography analysis.
  3. Perform TRL-4 integration and validation of assistive gesture input technology.

Expected results

  1. Software application for collecting stroke-gesture input, eye gaze tracking, EEG, and voice input commands in experimental setups.
  2. Gesture, voice, eye tracking, and EEG datasets collected from people with and without upper body motor impairments.
  3. New recognition algorithm for stroke-gesture input on touchscreen devices.
  4. TRL-4 integration of assistive gesture input technology.
  5. Scientific publications in high-quality conferences and journals and yearly research reports.

Team

  1. Prof. Radu-Daniel Vatavu, Principal Investigator
  2. Prof. Stefan-Gheorghe Pentiuc (Computer Science)
  3. Dr. Ovidiu-Andrei Schipor (Computer Science)
  4. Dr. Doina-Maria Schipor (Psychology)
  5. Dr. Ovidiu-Ciprian Ungurean (Computer Science)
  6. Dr. Ovidiu-Ionut Gherman (Computer Science)
  7. Bogdan-Florin Gheran (PhD student, Computer Science)

Publications

(ARC CORE rankings are indicated for publications in top, highly-selective venues: A*, A, and B.)

  1. Radu-Daniel Vatavu, Ovidiu-Ciprian Ungurean. (2019). Stroke-Gesture Input for People with Motor Impairments: Empirical Results & Research Roadmap. In Proceedings of CHI '19, the 37th ACM Conference on Human Factors in Computing Systems (Glasgow, Scotland, UK, May 2019). ACM, New York, NY, USA, Paper No. 215, 14 pages
    ACCEPTANCE RATE: 23.8% (703/2958) | ARC A* (CORE 2018)
    PDF | DOI | ISI WOS:000474467902066
  2. Maria-Doina Schipor. (2019). Attitude and Self-Efficacy of Students with Motor Impairments Regarding Touch Input Technology. Revista Romaneasca pentru Educatie Multidimensionala, 11(1). Lumen Publishing, 177-186
    PDF | DOI | ISI WOS:000460769100013
  3. Radu-Daniel Vatavu, Lisa Anthony, Jacob O. Wobbrock. (2018). $Q: A Super-Quick, Articulation-Invariant Stroke-Gesture Recognizer for Low-Resource Devices. In Proceedings of MobileHCI '18, the 20th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (Barcelona, Spain, September 2018). ACM, New York, NY, USA, Article No. 23
    ACCEPTANCE RATE: 23.1% (50/216) | ARC B (CORE 2018)
    PDF | DOI
    HONORABLE MENTION AWARD
  4. Luis A. Leiva, Daniel Martín-Albo, Radu-Daniel Vatavu. (2018). GATO: Predicting Human Performance with Multistroke and Multitouch Gesture Input. In Proceedings of MobileHCI '18, the 20th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (Barcelona, Spain, September 2018). ACM, New York, NY, USA, Article No. 32
    ACCEPTANCE RATE: 23.1% (50/216) | ARC B (CORE 2018)
    PDF | DOI
  5. Ovidiu-Ciprian Ungurean, Radu-Daniel Vatavu, Luis A. Leiva, Daniel Martín-Albo. (2018). Predicting Stroke Gesture Input Performance for Users with Motor Impairments. In Proceedings of MobileHCI '18 Adjunct, the 20th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (Barcelona, Spain, September 2018). ACM, New York, NY, USA, 23-30
    ARC B (CORE 2018)
    PDF | DOI
  6. Bogdan-Florin Gheran, Ovidiu-Ciprian Ungurean, Radu-Daniel Vatavu. (2018). Toward Smart Rings as Assistive Devices for People with Motor Impairments: A Position Paper. In Proceedings of RoCHI '18, 15th International Conference on Human-Computer Interaction (Cluj-Napoca, Romania, September 2018). Matrix Rom, 99-106
    PDF | DOI
  7. Bogdan-Florin Gheran, Jean Vanderdonckt, Radu-Daniel Vatavu. (2018). Gestures for Smart Rings: Empirical Results, Insights, and Design Implications. In Proceedings of DIS '18, the 13th ACM Conference on Designing Interactive Systems (Hong Kong, China, June 2018). ACM, New York, NY, USA, 623-635
    ACCEPTANCE RATE: 22.0% (107/487) | ARC B (CORE 2018)
    PDF | DOI
  8. Bogdan-Florin Gheran, Radu-Daniel Vatavu, Jean Vanderdonckt. (2018). Ring x2: Designing Gestures for Smart Rings using Temporal Calculus. In Proceedings of DIS '18 Companion, the 2018 ACM Conference Companion Publication on Designing Interactive Systems (Hong Kong, China, June 2018). ACM, New York, NY, USA, 117-122
    ACCEPTANCE RATE: 46.7% (50/107) | ARC B (CORE 2018)
    PDF | DOI | ISI WOS:000461155200020
  9. Luis A. Leiva, Daniel Martín-Albo, Réjean Plamondon, Radu-Daniel Vatavu. (2018). KeyTime: Super-Accurate Prediction of Stroke Gesture Production Times. In Proceedings of CHI '18, the 36th ACM Conference on Human Factors in Computing Systems (Montreal, Canada, April 2018). ACM, New York, NY, USA, Paper 239, 12 pages
    ACCEPTANCE RATE: 25.7% (666/2592) | ARC A* (CORE 2018)
    PDF | DOI
  10. Ovidiu-Ciprian Ungurean, Radu-Daniel Vatavu, Luis A. Leiva, Réjean Plamondon. (2018). Gesture Input for Users with Motor Impairments on Touchscreens: Empirical Results based on the Kinematic Theory. In Proceedings of CHI EA '18, the 36th ACM Conference Extended Abstracts on Human Factors in Computing Systems (Montreal, Canada, April 2018). ACM, New York, NY, USA, Paper LBW537, 6 pages
    ACCEPTANCE RATE: 39.8% (255/641) | ARC A* (CORE 2018)
    PDF | DOI
  11. Ovidiu-Ionut Gherman, Ovidiu-Andrei Schipor, Bogdan-Florin Gheran. (2018). VErGE: A System for Collecting Voice, Eye Gaze, Gesture, and EEG Data for Experimental Studies. In Proceedings of DAS '18, the 14th International Conference on Development and Application Systems (Suceava, Romania, May 2018). IEEE, 150-155
    PDF | DOI | ISI WOS:000467080400028
  12. Petru-Vasile Cioata, Radu-Daniel Vatavu. (2018). In Tandem: Exploring Interactive Opportunities for Dual Input and Output on Two Smartwatches. In Proceedings of IUI '18 Companion, the 23rd International Conference on Intelligent User Interfaces Companion (Tokyo, Japan, March 2018). ACM, New York, NY, USA, Article 60
    PDF | DOI | ISI WOS:000458680100060
  13. Radu-Daniel Vatavu. (2017). Characterizing Gesture Knowledge Transfer Across Multiple Contexts of Use. Journal on Multimodal User Interfaces 11 (4). Springer, 301-314
    PDF | DOI | ISI WOS:000417622600001
    IF: 1.031 | 5-Year IF: 1.039 (JCR 2016)

Media

  1. Snapshots from the data acquisition experiment. Part 1: stroke-gestures on a touchscreen device
  2. Snapshots from the data acquisition experiment. Part 2: gestures, voice, eye gaze, and EEG
  3. Snapshots from the TRL-4 validation user study for the VErGE (Voice, Eye gaze, Gestures, EEG) assistive application for users with motor impairments

Project reports (in Romanian)

  1. Scientific report 2017 PDF
  2. Scientific report 2018 PDF
  3. Final project report PDF

Other resources (software, algorithms, and datasets)

  1. Software application for data collection in experimental settings RAR | License
  2. VErGE (Voice, Eye gaze, Gesture & EEG) software application RAR | License
  3. $Q stroke-gesture recognition algorithm for touchscreens PDF (a simplified sketch of the underlying point-cloud matching idea appears after this list)
  4. Gesture datasets. The following datasets were collected:
    • A dataset of 9,681 touch stroke-gestures collected from 70 participants, 35 of whom have motor impairments.
    • A dataset of 3,600 recordings consisting of audio, gesture, EEG, and eye gaze data from 25 participants, 9 of whom have motor impairments.
    A demographic description of the participants with upper body motor impairments can be downloaded here.
    We will make the datasets available once the corresponding articles are accepted and published.
  5. Online demonstrations of the KeyTime technique (Leiva et al., 2018a PDF; Ungurean et al., 2018a PDF) and the GATO technique (Leiva et al., 2018b PDF; Ungurean et al., 2018b PDF) for predicting the production time of users' stroke-gesture input on touchscreens are available here.
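
For readers who want more than the PDF above, the sketch below illustrates the core idea behind the $-family of recognizers that $Q builds on (via its predecessor, $P): gestures are resampled to a fixed number of points, normalized, and classified with a greedy nearest-neighbor point-cloud distance. This is a minimal sketch for illustration only, not the optimized $Q implementation from the paper (which adds a point-quantization lookup table, lower bounds, and early abandoning for low-resource devices); all function and variable names here are our own.

```python
import math

N = 32  # number of points per gesture after resampling

def resample(points, n=N):
    """Resample a stroke (a list of (x, y) tuples) into n equidistant points."""
    pts = list(points)
    path = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    if path == 0:
        return [pts[0]] * n
    interval, acc = path / (n - 1), 0.0
    out = [pts[0]]
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # q starts the next segment
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(points):
    """Scale to a unit bounding box and translate the centroid to the origin."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def cloud_distance(a, b, n=N):
    """Greedy point-cloud matching in the style of $P: match each point of a
    to its nearest unmatched point of b, trying sqrt(n) starting alignments
    and weighting earlier matches more; keep the best total."""
    step = max(1, int(math.sqrt(n)))
    best = float("inf")
    for start in range(0, n, step):
        matched, total = [False] * n, 0.0
        for k in range(n):
            i = (start + k) % n
            j_best, d_best = -1, float("inf")
            for j in range(n):
                if not matched[j]:
                    d = math.dist(a[i], b[j])
                    if d < d_best:
                        j_best, d_best = j, d
            matched[j_best] = True
            total += (1 - k / n) * d_best  # earlier matches weigh more
        best = min(best, total)
    return best

def recognize(candidate, templates):
    """1-nearest-neighbor classification; `templates` is a list of
    {"name": str, "points": [(x, y), ...]} dicts."""
    c = normalize(resample(candidate))
    return min(templates,
               key=lambda t: cloud_distance(c, normalize(resample(t["points"]))))["name"]
```

In practice, templates would be resampled and normalized once at load time and reused across classifications; $Q's optimizations make this nearest-neighbor search fast enough for wearables and other low-resource hardware.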