Hybrid Target Selections by “Hand Gestures + Facial Expression” for a Rehabilitation Robot

Yi Han, Xiangliang Zhang, Ning Zhang, Shuguang Meng, Tao Liu, Shuoyu Wang, Min Pan, Xiufeng Zhang, Jingang Yi

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

In this study, we propose a “hand gesture + facial expression” human–machine interaction technique and apply it to a bedridden rehabilitation robot. The “hand gesture + facial expression” interaction technique combines gesture and facial expression perception as input modes: seven basic facial expressions are used to confirm a target-selection task, while hand gestures are used to control the cursor’s location. A controlled experiment was designed and conducted to evaluate the effectiveness of the proposed hybrid technique. A series of target-selection tasks with different target widths and layouts was used to examine the recognition accuracy of the hybrid control gestures, and an interaction experiment on a rehabilitation robot was designed to verify the feasibility of applying this interaction technique to rehabilitation robots. The experimental results show that the “hand gesture + facial expression” interaction is highly robust, can provide a novel guideline for designing applications in VR interfaces, and can be applied to rehabilitation robots.
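The abstract only describes the division of labor between the two input channels at a high level. As an illustration, the following minimal Python sketch shows one way a cursor position driven by the hand-gesture channel and a label produced by the facial-expression channel could be fused into a target-selection decision; it is not the authors’ implementation, and the target geometry, the set of seven expressions, and the choice of confirming expression are all assumptions made here for illustration.

```python
# Hypothetical sketch of a "hand gesture + facial expression" target-selection
# step, based only on the division of labor described in the abstract:
# hand gestures move the cursor, one of seven facial expressions confirms selection.
# All names, labels, and geometry below are illustrative assumptions.

from dataclasses import dataclass

# Seven basic expressions (Ekman-style set assumed here for illustration).
EXPRESSIONS = ["neutral", "happy", "sad", "angry", "surprised", "fearful", "disgusted"]

@dataclass
class Target:
    x: float
    y: float
    width: float  # target width, the variable manipulated in the reported experiments

    def contains(self, cx: float, cy: float) -> bool:
        # Treat the target as a square of side `width` centered at (x, y).
        return abs(cx - self.x) <= self.width / 2 and abs(cy - self.y) <= self.width / 2

def select_target(cursor_xy, expression, targets, confirm_expression="happy"):
    """Return the target under the cursor if the confirming expression is shown.

    cursor_xy          -- (x, y) position driven by the hand-gesture channel
    expression         -- label produced by the facial-expression channel
    targets            -- candidate targets laid out on screen
    confirm_expression -- which of the seven expressions confirms selection (assumed)
    """
    if expression != confirm_expression:
        return None  # no confirmation: the cursor just keeps moving
    cx, cy = cursor_xy
    for target in targets:
        if target.contains(cx, cy):
            return target
    return None

# Example: the cursor hovers over the second target and the user shows the
# confirming expression, so that target is selected.
targets = [Target(100, 100, 40), Target(200, 100, 40)]
print(select_target((205, 98), "happy", targets))  # -> Target(x=200, y=100, width=40)
```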
Original language: English (US)
Journal: Sensors
Volume: 23
Issue number: 1
DOIs
State: Published - Jan 1 2023
Externally published: Yes

ASJC Scopus subject areas

  • Biochemistry
  • Atomic and Molecular Physics, and Optics
  • Analytical Chemistry
  • Electrical and Electronic Engineering
