Deep Learning for Hand Gesture Recognition in Virtual Museum Using Wearable Vision Sensors

Nabil Zerrouki, Fouzi Harrou*, Amrane Houacine, Riadh Bouarroudj, Mohammed Yazid Cherifi, Ait Djafer Amina Zouina, Ying Sun

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Hand gestures facilitate user interaction and immersion in virtual museum applications, allowing users to navigate virtual exhibitions, interact with virtual artifacts, and control virtual environments naturally and intuitively. This study introduces a deep-learning-driven approach to hand gesture recognition using wearable vision sensors designed for interactive virtual museum environments. The proposed approach employs an image-based feature extraction strategy that captures five partial occupancy areas of the hand. A bidirectional long short-term memory (Bi-LSTM) model is then adopted to construct an effective hand gesture classifier. The bidirectionality of the Bi-LSTM enables it to capture dependencies in both forward and backward directions, yielding a more comprehensive representation of the temporal dynamics of hand motion and thereby improving accuracy and robustness. Performance is evaluated on publicly available datasets covering both virtual and real museum scenarios. The results show that the Bi-LSTM-based approach accurately distinguishes a variety of hand gestures: combining the five area ratios with Bi-LSTM classification enables robust recognition of diverse gestures and effectively discriminates between similar actions, such as the slide-left and slide-right classes. The approach also shows promising detection performance compared with conventional machine learning models and state-of-the-art (SOTA) methods, making it promising for enhancing user interaction and immersion in virtual museum experiences.
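The abstract does not specify how the five partial occupancy areas are defined. As a hedged illustration only, one plausible reading is that each video frame yields a binary hand mask whose bounding region is split into sub-regions, with each feature being the fraction of foreground pixels in that region. The function `occupancy_ratios` below is a hypothetical sketch under that assumption (four quadrants plus a centered window), not the authors' actual feature extractor; the per-frame ratio vectors would then form the time series fed to the Bi-LSTM.

```python
def occupancy_ratios(mask):
    """Compute five illustrative partial occupancy ratios from a binary mask.

    ASSUMPTION: the paper's exact partition is not given in the abstract.
    Here the mask is split into four quadrants plus a centered window,
    and each ratio is the fraction of foreground (1) pixels in the region.
    `mask` is a list of equal-length rows containing 0/1 values.
    """
    h = len(mask)
    w = len(mask[0])
    # Five regions: (row_start, row_end, col_start, col_end), half-open.
    regions = [
        (0, h // 2, 0, w // 2),                     # top-left quadrant
        (0, h // 2, w // 2, w),                     # top-right quadrant
        (h // 2, h, 0, w // 2),                     # bottom-left quadrant
        (h // 2, h, w // 2, w),                     # bottom-right quadrant
        (h // 4, 3 * h // 4, w // 4, 3 * w // 4),   # centered window
    ]
    ratios = []
    for r0, r1, c0, c1 in regions:
        area = (r1 - r0) * (c1 - c0)
        filled = sum(mask[r][c] for r in range(r0, r1) for c in range(c0, c1))
        ratios.append(filled / area if area else 0.0)
    return ratios


# Example: a fully occupied 4x4 mask gives a ratio of 1.0 in every region.
full = [[1] * 4 for _ in range(4)]
print(occupancy_ratios(full))  # → [1.0, 1.0, 1.0, 1.0, 1.0]
```

Stacking one such five-element vector per frame produces a sequence of shape (frames, 5), a natural input format for a bidirectional LSTM classifier.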

Original language: English (US)
Pages (from-to): 8857-8869
Number of pages: 13
Journal: IEEE Sensors Journal
Volume: 24
Issue number: 6
DOIs
State: Published - Mar 15 2024

Keywords

  • Bidirectional long short-term memory (Bi-LSTM) classification
  • ego-centric vision devices
  • feature extraction
  • hand gesture recognition
  • wearable vision

ASJC Scopus subject areas

  • Instrumentation
  • Electrical and Electronic Engineering

