WearSkill: Motor-Streamlined Interactions with Smart Wearables

Project summary
Project no.: PN-III-P2-2.1-PED-2019-0352; Contract no.: 276PED/2020
Principal Investigator: Radu-Daniel Vatavu
Funded by UEFISCDI, Romania
Funding scheme: PN-III P2 - Increasing economic competitiveness through research, development and innovation (PED)
Running period: August 2020 - August 2022 (24 months)


The goal of this project is to develop new input technology to increase the accessibility of smart wearables, such as glasses, watches, and rings, for people with upper-body motor impairments. To this end, we propose the concept of motor-streamlined input for wearables, which subsumes gesture input (in the form of touch, stroke-gestures, mid-air input, or head movements) and voice input according to a design, implementation, and evaluation paradigm centered on the specific motor abilities of the end user.

Figure 1. In contrast to smartphones with relatively large input surfaces for touch input (rightmost image), wearable devices, such as smartglasses, smart rings, and smartwatches (left and middle images), present specific interaction challenges for people with motor impairments.


We are developing a TRL-4 experimental model for motor-streamlined input with smart wearables. Our concrete objectives are:
  1. Examine how people with upper-body motor impairments interact with smart glasses, watches, and rings.
  2. Design and implement techniques (TRL-3) for effective input on smart glasses, watches, and rings.
  3. Integrate and validate input techniques in the form of an experimental model (TRL-4).

Expected results

The main result of this project will be an experimental model (TRL-4) to enable motor-streamlined use of smart wearables for users with upper-body motor impairments. Associated results are:
  1. Method to analyze gesture and voice input produced by people with motor impairments using wearable devices.
  2. Recognition results for gestures performed using smart glasses, watches, and rings.
  3. Experimental results and dataset of gestures performed using smart glasses, watches, and rings.
  4. Scientific publications in high-quality conferences and journals and yearly research reports.


Research team

  1.  Prof. Radu-Daniel Vatavu, Principal Investigator
  2.  Dr. Ovidiu-Andrei Schipor
  3.  Dr. Ovidiu-Ciprian Ungurean
  4.  Alexandru-Ionuț Șiean (PhD candidate in Computer Science)
  5.  Laura-Bianca Bilius (PhD candidate in Computer Science)


Publications

  1. Ovidiu-Ciprian Ungurean, Radu-Daniel Vatavu. (2022). "I Gave up Wearing Rings:" Insights on the Perceptions and Preferences of Wheelchair Users for Interactions with Wearables. IEEE Pervasive Computing, 10 pages
    PDF | DOI | 5-Year IF: 4.916 (JCR 2021) | WOS: 000779680900001 | A rank (ARC CORE)
  2. Ovidiu-Andrei Schipor, Radu-Daniel Vatavu. (2022). GearWheels: A Software Tool to Support User Experiments on Gesture Input with Wearable Devices. International Journal of Human-Computer Interaction. Taylor & Francis
    PDF | DOI | 5-Year IF: 4.503 (JCR 2021) | WOS: 000828950400001
  3. Radu-Daniel Vatavu, Ovidiu-Ciprian Ungurean. (2022). Understanding Gesture Input Articulation with Upper-Body Wearables for Users with Upper-Body Motor Impairments. Proceedings of CHI '22, the ACM Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, Article no. 2, 1-16
    PDF | DOI | ACCEPTANCE RATE: 24.7% (637/2579) | A* rank (ARC CORE)
  4. Ovidiu-Andrei Schipor, Laura-Bianca Bilius, Ovidiu-Ciprian Ungurean, Alexandru-Ionuț Siean, Alexandru-Tudor Andrei, Radu-Daniel Vatavu. (2022). Personalized Wearable Interactions with WearSkill. Proceedings of W4A '22, the 19th Web for All Conference. ACM, New York, NY, USA, 8:1-8:2
  5. Ovidiu-Andrei Schipor, Laura-Bianca Bilius, Radu-Daniel Vatavu. (2022). WearSkill: Personalized and Interchangeable Input with Wearables for Users with Motor Impairments. Proceedings of W4A '22, the 19th Web for All Conference. ACM, New York, NY, USA, 10:1-10:5
    PDF | DOI | ACCEPTANCE RATE: 50% (18/36)
  6. Radu-Daniel Vatavu, Laura-Bianca Bilius. (2021). GestuRING: A Web-based Tool for Designing Gesture Input with Rings, Ring-Like, and Ring-Ready Devices. Proceedings of UIST '21, the 34th Annual ACM Symposium on User Interface Software and Technology. ACM, New York, NY, USA, 710-723
    PDF | DOI | ACCEPTANCE RATE: 25.9% (95/367) | A* rank (ARC CORE)
  7. Laura-Bianca Bilius, Radu-Daniel Vatavu. (2021). Demonstration of GestuRING, a Web Tool for Ring Gesture Input. The Adjunct Publication of UIST '21, the 34th Annual ACM Symposium on User Interface Software and Technology. ACM, New York, NY, USA, 124-125
    PDF | DOI | A* rank (ARC CORE)
  8. Ovidiu-Ciprian Ungurean, Radu-Daniel Vatavu. (2021). Users with Motor Impairments' Preferences for Smart Wearables to Access and Interact with Ambient Intelligence Applications and Services. Proceedings of ISAmI '21, the 12th International Symposium on Ambient Intelligence. Lecture Notes in Networks and Systems, Springer Nature Switzerland, 10 pages
  9. Alexandru-Ionut Siean, Radu-Daniel Vatavu. (2021). Wearable Interactions for Users with Motor Impairments: Systematic Review, Inventory, and Research Implications. Proceedings of ASSETS '21, the 23rd International ACM SIGACCESS Conference on Computers and Accessibility. ACM, New York, NY, USA, Article No. 7, 1-15
    PDF | DOI | ACCEPTANCE RATE: 29.0% (36/124) | A rank (ARC CORE)
  10. Ovidiu-Andrei Schipor, Radu-Daniel Vatavu. (2021). Software Architecture Based on Web Standards for Gesture Input with Smartwatches and Smartglasses. Proceedings of MUM '21, the 20th International Conference on Mobile and Ubiquitous Multimedia (Leuven, Belgium). ACM, New York, NY, USA, 186-188
    PDF | DOI | B rank (ARC CORE)
  11. Octav Opaschi, Radu-Daniel Vatavu. (2020). Uncovering Practical Security and Privacy Threats for Connected Glasses with Embedded Video Cameras. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4(4). ACM, Article no. 167
    PDF | DOI
  12. Radu-Daniel Vatavu, Jean Vanderdonckt. (2020). Design Space and Users' Preferences for Smartglasses Graphical Menus: A Vignette Study. Proceedings of MUM '20, the 19th ACM International Conference on Mobile and Ubiquitous Multimedia. ACM, New York, NY, USA
    PDF | DOI | ACCEPTANCE RATE: 40.0% (32/80) | B rank (ARC CORE)



Project reports (in Romanian)

  1. Scientific report 2020-2022 PDF
  2. Scientific report 2022 PDF
  3. Scientific report 2021 PDF
  4. Scientific report 2020 PDF

Other resources (software, diagrams, and datasets)

  1. Hand gesture set for wearable interactions (high-resolution PDF) and gestures performed using the head and feet (high-resolution PDF). These gestures represent a compilation of 92 possible interactions with smartglasses, head-mounted displays (HMDs), smartwatches, earsets, headsets, on-body-located touch pads, smart armbands, and foot-mounted devices that were extracted, classified, and analyzed during a systematic review of the scientific literature on accessible wearable interactions.
    Artwork copyright © 2021 Alexandru-Ionut Siean and Radu-Daniel Vatavu.
  2. Association map highlighting the preferences of users with motor impairments for applications of smart wearables (high-resolution PDF).
  3. An overview of the perceptions of 21 people with motor impairments (wheelchair users) regarding smart wearable devices (high-resolution PDF) and a dataset of 2,625 records about their perceptions of fitness trackers, smartwatches, smartglasses, earbuds, and rings.
  4. Software applications demonstrating TRL-3 implementations of various input modalities for smart wearables: (1) motion gesture input on the Samsung Galaxy Watch 3 RAR, (2) touch gesture input on a smart ring implemented with the Gear Fit 2 device RAR, and (3) voice input on the Vuzix Blade smartglasses RAR. Software is released under this License.
  5. GearWheels software application (node.js, HTML/JavaScript) for conducting data collection experiments with wearable devices RAR | License
  6. GestuRING web application for ring gestures: link.
    GestuRING assists HCI researchers and practitioners in designing gesture input for ring, ring-like, and ring-ready devices. It offers a systematically structured body of scientific knowledge, readily available to inform design decisions about which ring gestures should trigger which functions in an interactive system and to inspire new gesture designs.
  7. Two datasets are available here regarding gesture input with smartwatches, glasses, and rings. The first dataset contains 7,290 stroke-gesture samples collected from 28 participants, 14 of whom have upper-body motor impairments. The second contains 3,809 motion-gesture samples collected from the same participants.
  8. WearSkill software application (node.js, HTML/JavaScript) for personalized interactions with wearable devices and users with motor impairments.
    RAR | License
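
To illustrate the kind of processing the stroke-gesture datasets above lend themselves to, the sketch below shows a minimal template matcher in the style of the $1 family of recognizers: resample each gesture to a fixed number of points, normalize position and scale, and classify by nearest average Euclidean distance to a set of templates. It is written in JavaScript to match the node.js stack mentioned for GearWheels and WearSkill, but it is an illustration only: the point format ({x, y} arrays), the template list, and the recognizer itself are assumptions for this sketch, not the project's actual data format or recognition pipeline.

```javascript
const N = 32; // number of resampled points per gesture

// Total Euclidean length of a stroke given as an array of {x, y} points.
function pathLength(points) {
  let d = 0;
  for (let i = 1; i < points.length; i++) {
    d += Math.hypot(points[i].x - points[i - 1].x, points[i].y - points[i - 1].y);
  }
  return d;
}

// Resample a stroke to n equidistant points along its path.
function resample(points, n = N) {
  const interval = pathLength(points) / (n - 1);
  const pts = points.slice();
  const out = [pts[0]];
  let acc = 0;
  for (let i = 1; i < pts.length; i++) {
    const d = Math.hypot(pts[i].x - pts[i - 1].x, pts[i].y - pts[i - 1].y);
    if (acc + d >= interval && d > 0) {
      const t = (interval - acc) / d;
      const q = {
        x: pts[i - 1].x + t * (pts[i].x - pts[i - 1].x),
        y: pts[i - 1].y + t * (pts[i].y - pts[i - 1].y),
      };
      out.push(q);
      pts.splice(i, 0, q); // q starts the next segment
      acc = 0;
    } else {
      acc += d;
    }
  }
  while (out.length < n) out.push(pts[pts.length - 1]); // pad rounding losses
  return out.slice(0, n);
}

// Translate the centroid to the origin and scale to a unit bounding box.
function normalize(points) {
  const cx = points.reduce((s, p) => s + p.x, 0) / points.length;
  const cy = points.reduce((s, p) => s + p.y, 0) / points.length;
  const xs = points.map((p) => p.x);
  const ys = points.map((p) => p.y);
  const w = Math.max(...xs) - Math.min(...xs) || 1;
  const h = Math.max(...ys) - Math.min(...ys) || 1;
  return points.map((p) => ({ x: (p.x - cx) / w, y: (p.y - cy) / h }));
}

// Average point-to-point distance between two equally sampled gestures.
function distance(a, b) {
  let d = 0;
  for (let i = 0; i < a.length; i++) {
    d += Math.hypot(a[i].x - b[i].x, a[i].y - b[i].y);
  }
  return d / a.length;
}

// Nearest-neighbor classification against a list of {name, points} templates.
function recognize(candidate, templates) {
  const c = normalize(resample(candidate));
  let best = { name: null, d: Infinity };
  for (const t of templates) {
    const d = distance(c, normalize(resample(t.points)));
    if (d < best.d) best = { name: t.name, d };
  }
  return best.name;
}

// Example usage with two hypothetical templates:
const templates = [
  { name: "diagonal", points: Array.from({ length: 20 }, (_, i) => ({ x: i, y: i })) },
  { name: "horizontal", points: Array.from({ length: 20 }, (_, i) => ({ x: i, y: 0 })) },
];
const candidate = Array.from({ length: 15 }, (_, i) => ({ x: 2 * i, y: 2 * i + 0.1 }));
console.log(recognize(candidate, templates)); // "diagonal"
```

A production recognizer for users with motor impairments would need more than this sketch, e.g., rotation handling and tolerance for tremor and variable articulation speed, which is precisely the kind of analysis the datasets above support.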