DATASET ID# 2024-HAND-CHAIR-GESTURES

This dataset contains 1,620 hand gestures elicited from 54 participants for 15 referents representing common actions, digital content types, and navigation commands in interactive systems. A distinctive characteristic of these gestures is that they are articulated in relation to the structural parts of various chair designs (armchair, office-chair, stool) or to the seated position, i.e., they are "hand-chair gestures." Alongside the computational representations of these gestures, we also release 6,480 self-reported ratings, four per gesture, regarding ease of use, recall, goodness of fit with the corresponding referent, and social acceptability. The dataset is a companion resource to our paper, which examines hand-chair gestures in detail using various gesture articulation, similarity, and perception measures.

Hand-Chair Gestures

Hand-chair gestures are hand movements and poses performed in relation to the chair's structural parts or the seated position, and are of two kinds:
  • On-chair gestures use the chair's surface, enabling touch, stroke, and grasp input, e.g., grasping the seat of a stool (Figure a), touching the backrest of an office-chair (Figure b), or swiping on the armrest of an armchair (Figure c) for fast, always-available tactile input to control an interactive system.
  • From-chair gestures are non-contact and performed around the chair, in the user's peripersonal space, e.g., turning a knob in mid-air with the elbow supported (Figure d), hovering a hand above the armrest (Figure e), or performing bimanual input while seated (Figure f).
While on-chair gestures turn the chair into an interactive surface, from-chair gestures extend interaction beyond the chair's boundaries, as illustrated in the next figure, which shows statistical data on the spatial extent of hand-chair gestures.
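
To make the two kinds concrete in code, below is a minimal C# sketch of one way the taxonomy could be encoded when analyzing the dataset; the enum, record, and field names are illustrative assumptions on our part, not identifiers from the released C# code.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Illustrative encoding of the two-way taxonomy; all names in this
    // sketch are our assumptions, not identifiers from the released code.
    public enum GestureKind
    {
        OnChair,    // contact input on the chair's surface: touch, stroke, grasp
        FromChair   // non-contact input around the chair, in peripersonal space
    }

    public record HandChairGesture(
        string Referent,       // e.g., "Music"
        string ChairDesign,    // "armchair", "office-chair", or "stool"
        GestureKind Kind);

    public static class TaxonomyDemo
    {
        public static void Main()
        {
            var gestures = new List<HandChairGesture>
            {
                new("Music", "stool", GestureKind.OnChair),
                new("Turn on/off lights", "armchair", GestureKind.FromChair),
            };

            // Group by kind, e.g., to analyze each gesture class separately.
            foreach (var group in gestures.GroupBy(g => g.Kind))
                Console.WriteLine($"{group.Key}: {group.Count()} gesture(s)");
        }
    }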

Resources

We release our gesture dataset, C# code, and gesture ratings data, free to download and use for research purposes under this license.
The hand-chair gestures from our dataset were recorded using two Tap Strap v2 devices, one worn on each hand, and consist of linear 3D acceleration data for each of the ten fingers (a sketch of one possible in-memory representation follows the list below). They correspond to the following 15 referents:
  • Place/answer call: answer/end an incoming phone call
  • Set/cancel alarm: activate/deactivate the most recent alarm (the alarm is set if it is off and vice versa)
  • Turn on/off lights: turn on/off the lights (lights turn on if they are off and vice versa)
  • Turn on/off TV: turn on/off the TV (the TV turns on if it is off and vice versa)
  • Turn on/off AC: turn on/off the air conditioner (the air conditioner turns on if it is off and vice versa)
  • Photos and videos: get direct access to photos/videos; the first photo is displayed on a screen
  • Music: get direct access to music; the first file starts playing
  • Messages: get direct access to messages; the most recent message is displayed on a screen
  • Agenda/calendar: get direct access to the agenda/calendar, displayed on a screen
  • Contacts: get direct access to phone contacts, which are displayed on a screen
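
Because each recording stores linear 3D acceleration for each of the ten fingers, the following minimal C# sketch shows one possible in-memory representation together with a loader for a hypothetical CSV layout; the type names, file format, and column order are our assumptions, not the format of the released data.

    using System.Collections.Generic;
    using System.Globalization;
    using System.IO;

    // One linear acceleration sample (x, y, z) for a single finger.
    public readonly record struct Accel3D(float X, float Y, float Z);

    // One time-stamped frame: ten fingers (five per hand, two Tap Strap
    // devices), each with a 3D linear acceleration sample.
    public record GestureFrame(double TimestampSeconds, Accel3D[] Fingers);

    public static class GestureLoader
    {
        // Hypothetical CSV layout (our assumption): one frame per line,
        // "timestamp,f0x,f0y,f0z,...,f9x,f9y,f9z" = 1 + 10 * 3 = 31 values.
        public static List<GestureFrame> LoadCsv(string path)
        {
            var frames = new List<GestureFrame>();
            foreach (var line in File.ReadLines(path))
            {
                var parts = line.Split(',');
                if (parts.Length != 31)
                    continue;  // skip header or malformed lines

                static float F(string s) =>
                    float.Parse(s, CultureInfo.InvariantCulture);

                var fingers = new Accel3D[10];
                for (int f = 0; f < 10; f++)
                    fingers[f] = new Accel3D(
                        F(parts[1 + f * 3]),
                        F(parts[2 + f * 3]),
                        F(parts[3 + f * 3]));

                frames.Add(new GestureFrame(
                    double.Parse(parts[0], CultureInfo.InvariantCulture),
                    fingers));
            }
            return frames;
        }
    }

A frame here pairs a timestamp with ten per-finger acceleration samples; in practice, the released files should be parsed according to their documented format.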
If you find our dataset and/or code useful in your work, please let us know. If you use our dataset and/or code in scientific publications, please cite the paper that introduced these resources.

Publication

Contact

For any information or suggestion regarding this web page, the dataset, or the source code, please contact Prof. Radu-Daniel Vatavu.