Radu-Daniel Vatavu

13 Universitatii · Suceava, 720229 · Romania · EU     radu.vatavu@usm.ro    Skype: radu.vatavu

I am a Professor of Computer Science (HDR) at the "Ștefan cel Mare" University of Suceava, where I conduct research in Human-Computer Interaction (HCI), Ambient Intelligence (AmI), and Entertainment Computing. I direct the Machine Intelligence and Information Visualization Lab (MintViz), an interdisciplinary research laboratory within the MANSiD Research Center.

While I am broadly interested in Human-Computer Interaction, I have focused primarily on gesture technology for effective interaction with computing systems, from large-scale installations to personal mobile and wearable devices. I am also interested in inclusive design for all abilities, and my work has often addressed UI design for young children and for people with visual or motor impairments; see the ongoing Research Projects section.

My work has received paper awards at ACM's specialized conferences, such as MobileHCI (2018), CHI (2016, 2015), TVX (2015), and ICMI (2012), and has been regularly recognized with Research Awards (2018, 2017, 2015, 2013, 2012) by UEFISCDI, the Romanian Executive Agency for Funding Higher Education, Research, Development, and Innovation.

I regularly release, with the help of collaborators, software tools, source code, and datasets that are freely available to download and use for research purposes, such as the $P, $P+, and $Q gesture recognizers, the Agreement Analysis Toolkit (AGATe) for gesture elicitation studies, the Gesture Heatmaps Toolkit (GHoST), or the KeyTime and GATO web apps for predicting touch gesture production times; see the Code, Tools, and Datasets section for all the resources.
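
The $P family of recognizers treats gestures as point clouds matched by a greedy point-correspondence search. The sketch below is a simplified illustration of that idea (resampling, normalization, greedy cloud matching), not the released implementation; all function names and parameter values here are my own choices for the example.

```python
import math

def resample(points, n=32):
    """Resample a gesture path to n roughly equidistant points."""
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1)
    pts, new_points, dist_accum, i = list(points), [points[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if dist_accum + d >= interval:
            t = (interval - dist_accum) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_points.append(q)
            pts.insert(i, q)      # continue resampling from the new point
            dist_accum = 0.0
        else:
            dist_accum += d
        i += 1
    while len(new_points) < n:    # guard against floating-point shortfall
        new_points.append(pts[-1])
    return new_points[:n]

def normalize(points):
    """Translate the centroid to the origin and scale to a unit box."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    pts = [(x - cx, y - cy) for x, y in points]
    w = max(x for x, _ in pts) - min(x for x, _ in pts)
    h = max(y for _, y in pts) - min(y for _, y in pts)
    s = max(w, h) or 1.0
    return [(x / s, y / s) for x, y in pts]

def greedy_cloud_match(a, b):
    """Greedily match each point in a to its nearest unmatched point in b."""
    n, best = len(a), float('inf')
    for start in range(0, n, max(1, int(n ** 0.5))):
        matched, total = [False] * n, 0.0
        for k in range(n):
            i = (start + k) % n
            j_best, d_best = -1, float('inf')
            for j in range(n):
                if not matched[j]:
                    d = math.dist(a[i], b[j])
                    if d < d_best:
                        j_best, d_best = j, d
            matched[j_best] = True
            total += (1 - k / n) * d_best  # early matches weigh more
        best = min(best, total)
    return best

def recognize(candidate, templates, n=32):
    """Return the name of the template with the smallest cloud distance."""
    c = normalize(resample(candidate, n))
    scored = []
    for name, pts in templates.items():
        t = normalize(resample(pts, n))
        scored.append((min(greedy_cloud_match(c, t), greedy_cloud_match(t, c)), name))
    return min(scored)[1]
```

For instance, a nearly horizontal stroke is matched to a straight-line template rather than a V-shaped one, regardless of position or scale, because point clouds carry no stroke-order or timing information.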

A complete, up-to-date Curriculum Vitae is available here.


  • ACM

  • DBLP

  • ORCID

  • Springer

  • Scholar

  • ResearchGate

  • YouTube

  • LinkedIn

Selected Videos

For more research videos, please see my YouTube channel here.

Recent papers in conferences and journals

Research Team

I direct the Machine Intelligence and Information Visualization Lab (MintViz), an interdisciplinary research laboratory within the MANSiD Research Center of the "Ștefan cel Mare" University of Suceava.

Our research efforts are directed toward the design, development, and evaluation of new artificial intelligence (AI) and information visualization (InfoVis) technology that supports innovative, rich interaction between humans, computers, and environments, advancing knowledge in Human-Computer Interaction, Ambient Intelligence, and Entertainment Computing. Toward this end, we have been interested in essential topics from HCI and AmI, such as applying pattern recognition and machine learning techniques to human-computer interaction and information visualization, designing touch, free-hand, and whole-body gesture interactions, prototyping new interaction techniques for smart mobiles and wearables, designing new user experiences for smart environments, prototyping for home entertainment and interactive TV, and exploring ambient media concepts.

I am always looking for talented students to work on HCI-related projects. If you are interested, send me an email.

Research Projects

I am grateful for support from UEFISCDI Romania (the Romanian Executive Agency for Funding Higher Education, Research, Development, and Innovation), the Agence Universitaire de la Francophonie (AUF), Wallonie-Bruxelles International (WBI), Belgium, OeAD Austria, the Ministry of Science and Technology, China, and the European Commission through the European Social Fund, the FP7 and COST programmes, and the Increase of Economic Competitiveness Fund, as well as from the EUCogIII network. My students have also benefited from SIGCHI Student Travel Grants from ACM SIGCHI.

Below is a list of current and past research projects funded with the support of the institutions mentioned above.

MotorSkill: Effective Gesture Interactions with Touch Surfaces for Motor Impairment Conditions

Funded by UEFISCDI, Romania, August 2017 - December 2018
Touch input on today's touchscreen devices requires many dexterous abilities. Unfortunately, upper-body motor impairments, such as spinal cord injury, cerebral palsy, or muscular dystrophy, impede the formation of an effective hand pose for touching the screen precisely and confidently. The goal of this project is to develop new technology that assists people with motor impairments in using gesture input on touchscreen devices effectively. To this end, we explore eye-gaze tracking, voice input, and electroencephalography (EEG) analysis.
Project home page

Gesture Interfaces for Visually-Impairing Interaction Contexts

Funded by UEFISCDI, Romania, October 2015 - September 2017
Because smart mobile devices feature touchscreens that are poorly adapted to non-visual input, people with visual impairments must employ various workaround strategies to use such devices effectively and independently. The goal of this project is to understand the effects of visual impairments on the touchscreen gesture performance of people with low vision and to inform the development of new algorithms that recognize gestures effectively in visually-impairing contexts of use.
Project home page

Computational Psychology of Human Movement to Understand Gestures and Body Kinesics

Funded by UEFISCDI, Romania and Wallonie-Bruxelles International, Belgium, Jan. 2017 - Dec. 2018
Co-PI: Jean Vanderdonckt
Bilateral cooperation project with Université catholique de Louvain, Belgium, to finance short mobility stages and foster scientific cooperation between the two institutions. The goal of the cooperation is to empower the community with relevant software tools for gesture analysis, which we expect to foster new developments in whole-body gesture interfaces for interactive applications.
Project home page

Interaction Techniques with Massive Data Clouds in Smart Environments

Funded by UEFISCDI, Romania and Ministry of Science and Technology, P.R. China, October 2016 - December 2017; July 2018 - December 2018
Co-PI: Wenjun Wu
Bilateral cooperation project with Beihang University, China, to finance short mobility stages and foster scientific cooperation between the two institutions. The goal of the cooperation is to design and develop visualization techniques for large datasets stored in the cloud, using large ambient displays.
Project home page

Multimodal Feedback for Supporting Gestural Interaction in Smart Environments

Funded by UEFISCDI, Romania and OeAD, Austria, Jan. 2014 - Dec. 2015
Co-PI: Hannes Kaufmann
Bilateral cooperation project with Technical University of Vienna, Austria, to finance short mobility stages and foster scientific cooperation between the two institutions. The goal of the cooperation is to acquire new scientific knowledge for the design and development of feedback techniques for gesture interfaces.
Project home page

Resources: Code, Tools, and Datasets

This section is under maintenance. Sorry for the temporary inconvenience.