1
Sivakanthan S, Candiotti JL, Sundaram AS, Duvall JA, Sergeant JJG, Cooper R, Satpute S, Turner RL, Cooper RA. Mini-review: Robotic wheelchair taxonomy and readiness. Neurosci Lett 2022;772:136482. PMID: 35104618; PMCID: PMC8887066; DOI: 10.1016/j.neulet.2022.136482.
Abstract
Robotic wheelchair research and development is a growing sector. This article introduces a robotic wheelchair taxonomy and a readiness model, supported by a mini-review. The taxonomy is constructed from power wheelchair and mobile robot standards, the ICF, and the PHAATE model. The mini-review of 2797 articles spanning 7 databases yielded 205 articles and 4 review articles matching the inclusion/exclusion criteria. The review and analysis illustrate how innovations in robotic wheelchair research have progressed yet have been slow to translate into the marketplace.
Affiliation(s)
- Sivashankar Sivakanthan
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA
- Jorge L Candiotti
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA
- Andrea S Sundaram
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA
- Jonathan A Duvall
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA
- Rosemarie Cooper
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA
- Shantanu Satpute
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA; Department of Bioengineering, Swanson School of Engineering, University of Pittsburgh, Pittsburgh, PA, USA
- Rose L Turner
- Health Science Library System, University of Pittsburgh, Pittsburgh, PA, USA
- Rory A Cooper
- Human Engineering Research Laboratories, VA Pittsburgh Healthcare System, Pittsburgh, PA, USA; Human Engineering Research Laboratories, School of Health and Rehabilitation Sciences, Pittsburgh, PA, USA
2
Sharan RV, Berkovsky S, Liu S. Voice Command Recognition Using Biologically Inspired Time-Frequency Representation and Convolutional Neural Networks. Annu Int Conf IEEE Eng Med Biol Soc 2020;2020:998-1001. PMID: 33018153; DOI: 10.1109/embc44109.2020.9176006.
Abstract
Voice command is an important interface between humans and technology in healthcare, such as for hands-free control of surgical robots and in patient care technology. Voice command recognition can be cast as a speech classification task, in which convolutional neural networks (CNNs) have demonstrated strong performance. CNNs were originally developed for image classification, and time-frequency representations of speech signals are the most commonly used image-like input for them. This work investigates the cochleagram, computed with a gammatone filterbank that models the frequency selectivity of the human cochlea, as the time-frequency representation of voice commands and input to the CNN classifier. We also explore multi-view CNN as a technique for combining learning from different time-frequency representations. The proposed method is evaluated on a large dataset and shown to achieve high classification accuracy.
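As an illustration of the representation this abstract describes (a sketch, not code from the paper), a cochleagram can be computed by passing the signal through a gammatone filterbank and taking framewise log energy per band. The band count, frame/hop sizes, and ERB-scale spacing below are assumptions chosen for clarity:

```python
import numpy as np
from scipy.signal import gammatone, lfilter

def cochleagram(signal, fs, n_bands=64, fmin=50.0,
                frame_len=400, hop=160):
    """Crude cochleagram: gammatone filterbank + framewise log energy.

    Center frequencies are spaced on the ERB-rate scale (Glasberg &
    Moore constants) between fmin and just below the Nyquist frequency.
    """
    erb = lambda f: 21.4 * np.log10(1 + 0.00437 * f)
    inv_erb = lambda e: (10 ** (e / 21.4) - 1) / 0.00437
    cfs = inv_erb(np.linspace(erb(fmin), erb(0.45 * fs), n_bands))

    n_frames = 1 + (len(signal) - frame_len) // hop
    out = np.empty((n_bands, n_frames))
    for i, cf in enumerate(cfs):
        b, a = gammatone(cf, 'iir', fs=fs)  # 4th-order IIR gammatone
        band = lfilter(b, a, signal)
        power = band ** 2                   # instantaneous power
        for t in range(n_frames):
            out[i, t] = power[t * hop : t * hop + frame_len].mean()
    return np.log(out + 1e-10)              # log-compress, image-like
```

The resulting `(n_bands, n_frames)` array can then be fed to a CNN exactly like a spectrogram image.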
3
Letaief M, Rezzoug N, Gorce P. Comparison between joystick- and gaze-controlled electric wheelchair during narrow doorway crossing: Feasibility study and movement analysis. Assist Technol 2019;33:26-37. PMID: 30945980; DOI: 10.1080/10400435.2019.1586011.
Abstract
Due to motor deficiencies inducing low force capabilities or tremor, many people have great difficulty using joystick-operated wheelchairs. To alleviate such difficulties, alternative interfaces using vocal, gaze, or brain signals are becoming available. While promising, these systems still need thorough evaluation. Within this framework, this study analyzes and evaluates the behavior of 11 able-bodied subjects during a navigation task, involving a door crossing, executed with a gaze- or joystick-operated electric wheelchair. The wheelchair was equipped with retroreflective markers whose movements were recorded with an optoelectronic system; gaze commands were detected using an eye-tracking device. In addition to the classical forward, backward, stop, left, and right commands, the chosen screen-based interface integrated forward-right and forward-left commands. The global success rate with gaze-based control was 80.3%. The path optimality ratio was 0.97, and the subjects adopted similar trajectories with both systems. The results for gaze control are promising and highlight the heavy use of the forward-left and forward-right commands (25% of all issued commands), which may explain the similarity between the trajectories obtained with the two interfaces.
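The seven-command screen interface described above can be sketched as a lookup from a recognized command to a wheelchair velocity setpoint. This is illustrative only: the speed values, the (linear, angular) convention, and the stop fallback are assumptions, not taken from the study:

```python
from typing import Dict, Tuple

# Hypothetical mapping of the seven screen-based gaze commands to
# (linear m/s, angular rad/s) setpoints; values are illustrative.
COMMANDS: Dict[str, Tuple[float, float]] = {
    "stop":          (0.0,  0.0),
    "forward":       (0.5,  0.0),
    "backward":      (-0.3, 0.0),
    "left":          (0.0,  0.6),   # turn in place
    "right":         (0.0, -0.6),
    "forward-left":  (0.4,  0.4),   # blended arc for doorway approach
    "forward-right": (0.4, -0.4),
}

def command_to_twist(cmd: str) -> Tuple[float, float]:
    """Resolve a recognized gaze command to a velocity setpoint;
    unrecognized input falls back to a safe stop."""
    return COMMANDS.get(cmd, COMMANDS["stop"])
```

The blended forward-left/forward-right entries are what lets the chair follow a smooth arc instead of alternating between pure forward and pure turn commands, consistent with the trajectory similarity the study reports.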
Affiliation(s)
- Manel Letaief
- Laboratoire de Biomodélisation et d'Ingénierie des Handicaps, Université de Toulon, La Garde, France
- Nasser Rezzoug
- Laboratoire de Biomodélisation et d'Ingénierie des Handicaps, Université de Toulon, La Garde, France
- Philippe Gorce
- Laboratoire de Biomodélisation et d'Ingénierie des Handicaps, Université de Toulon, La Garde, France