Virtual reality opens the door to the future of rehabilitation

How can a commercially available tracking system, originally used for gaming, be applied to recognise and monitor the performance of exercises used in rehabilitation? This question guided my Master's thesis project for the BioRobotics Institute of the Sant'Anna School of Pisa, Italy, which I carried out in the London Exploration Lab of Konica Minolta Laboratory Europe. The project is described in the accompanying video and explained further below.

Setting up the tracking system

The first step was to connect an Avatar VR, a motion-tracking device produced by Neurodigital Technologies, to my computer and record the data that it generated.

Avatar VR is a system comprising arm and chest tracking bands and a pair of sensorised gloves. Each glove features one Inertial Measurement Unit (IMU) on each finger except for the thumb, which has two, plus one central IMU on the back of the palm, resulting in a system of seven IMUs that measure the movement of the hand and its digits in real time. There is also one additional IMU in each of the arm and chest tracking bands.

Each of these IMUs, from the fingers and palm to the arm and chest bands, produces a unit quaternion, that is, a set of four numbers that uniquely identifies changes in orientation in 3D space: this is the kind of data usually preferred for 3D animation, and it is what we used in our analysis. The IMU on the palm also provides accelerometer data.
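
To make the quaternion data more concrete, here is a minimal Python sketch (not taken from the thesis code) of how a relative rotation angle between two IMUs, for example the palm and a fingertip, can be computed from their unit quaternions. The (w, x, y, z) ordering and the use of this angle as a simple flexion proxy are assumptions.

```python
import numpy as np

def quat_conj(q):
    """Conjugate of a unit quaternion given as (w, x, y, z)."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def relative_angle(q_palm, q_finger):
    """Rotation angle (radians) between two IMU orientations,
    used here as a rough proxy for finger flexion."""
    q_rel = quat_mul(quat_conj(q_palm), q_finger)
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return 2.0 * np.arccos(w)

# Example: palm at the identity orientation, index IMU rotated 45 degrees about x
q_palm = np.array([1.0, 0.0, 0.0, 0.0])
q_index = np.array([0.9238795, 0.3826834, 0.0, 0.0])
print(np.degrees(relative_angle(q_palm, q_index)))  # ~45.0
```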

Avatar VR can send data via USB or wirelessly over a Bluetooth connection. When the system is connected to a computer, data arrives at between 63.2 frames per second (when both gloves are used in parallel) and 65 frames per second (when only one glove is connected). Data can be acquired from different programming languages and environments, thanks to libraries for Unity, C++ and C#. For my project, data acquisition was handled by a C++ script that I developed with the Software Development Kit (SDK) provided for customisation.
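
The acquisition itself went through the C++ SDK, which is not reproduced here. The Python sketch below only shows how a hypothetical frame log (one timestamped row of seven quaternions per frame) could be loaded and its effective frame rate checked against the 63.2-65 fps range quoted above; the file name and column layout are assumptions, not the Avatar VR SDK's own format.

```python
import numpy as np

# Hypothetical log: one row per frame, first column a timestamp in seconds,
# then (w, x, y, z) for each of the glove's seven IMUs = 29 columns in total.
frames = np.loadtxt("glove_log.csv", delimiter=",")
timestamps = frames[:, 0]
quaternions = frames[:, 1:].reshape(-1, 7, 4)   # (n_frames, 7 IMUs, wxyz)

# The effective rate should fall within the 63.2-65 fps range reported for the device.
fps = 1.0 / np.diff(timestamps).mean()
print(f"{frames.shape[0]} frames captured at ~{fps:.1f} fps")
```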

Following the instructions on the Get Started page, I was able to install the libraries and start testing the system with basic movements, and with oscillations that can be felt on the hand as vibrations when the built-in glove actuators are enabled.

Collecting data from the gestures of right-handed volunteers

Functional rehabilitation exercises aim to restore lost skills through interaction with objects that patients commonly need to grasp in their daily life. This is why performance measures of hand function usually rely on a physician who rates the ability to accomplish so-called activities of daily living (ADL).

Following the state-of-the-art approach, the reliability of the glove was assessed by comparing the finger motion/flexion recorded from two healthy subjects during the performance of a task inspired by ADL. The flexion angles measured at each point in time were summed across the signals coming from four digits, and the similarity of the resulting finger-bending trajectories was evaluated with the Intraclass Correlation Coefficient (ICC), a statistical parameter that assesses the reliability of a series of measurements. The average ICC was 0.87, above the 0.70 threshold commonly taken as the criterion of acceptability: this means that the Avatar VR system is a reliable tracker of the typical movements that might be used in hand rehabilitation exercises.
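
The text does not specify which ICC form was used, so the sketch below assumes ICC(2,1) (two-way random effects, absolute agreement, single measurement), applied to two finger-bending trajectories laid out as columns of a matrix; the array names are illustrative.

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    Y has shape (n_targets, k_raters); here each row is one time-normalised sample
    and each column one recording of the summed finger-flexion trajectory."""
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Usage (hypothetical arrays): two 100-sample summed-flexion trajectories as columns
# Y = np.column_stack([traj_recording_a, traj_recording_b]); print(icc_2_1(Y))
```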

Thirteen right-handed, healthy volunteers from Konica Minolta's London office took part in the data collection, which produced a dataset covering four different tasks. The tasks were based on ADL, since these are commonly used to evaluate the recovery of hand function in rehabilitation. Each volunteer was recorded while performing each of the tasks ten times: grasping a glass, a hairbrush, a telephone and a toothbrush.

Analysis and classification of gesture data

Subsequent data analyses comprised the following steps:

  • Segmentation and pre-processing: ten single cycles of object reaching, grasping and releasing were cut from the whole recordings by means of MATLAB scripts; each cycle was time-normalised to one hundred samples.
  • Feature extraction: performed with the NumPy library in Python.
  • Feature selection: four different algorithms available in the scikit-learn library for Python were evaluated.
  • Classification: performed by comparing the number of correctly classified dataset samples achieved by different algorithms available in the scikit-learn library; each model was trained on twelve subjects and tested on the remaining unseen one, and this procedure was repeated for all subjects (a leave-one-subject-out scheme, sketched in the code after this list).
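
As a rough illustration of these steps, here is a hedged scikit-learn sketch of a leave-one-subject-out evaluation with Recursive Feature Elimination and a linear SVM. The feature dimensionality, the number of retained features and the random data standing in for the real recordings are placeholders, and RFE and the SVM are only one of the selection/classification combinations that were compared.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def time_normalise(cycle, n_samples=100):
    """Resample one reach-grasp-release cycle to a fixed number of samples."""
    t_old = np.linspace(0.0, 1.0, len(cycle))
    t_new = np.linspace(0.0, 1.0, n_samples)
    return np.interp(t_new, t_old, cycle)

# Placeholder data: 13 subjects x 4 tasks x 10 repetitions = 520 samples,
# each described by a hypothetical 60-dimensional feature vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(520, 60))
y = np.tile(np.repeat(np.arange(4), 10), 13)   # task labels
groups = np.repeat(np.arange(13), 40)          # subject IDs

# RFE keeps the most discriminative features; the linear SVM is then trained on
# twelve subjects and tested on the one left out, for every subject in turn.
model = make_pipeline(
    StandardScaler(),
    RFE(SVC(kernel="linear"), n_features_to_select=20),
    SVC(kernel="linear"),
)
scores = cross_val_score(model, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean leave-one-subject-out accuracy: {scores.mean():.2f}")
```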

Finally, we defined a distance metric to assess the quality of a new movement recorded from a rehabilitation patient, with reference to the ‘nominal’ standard movement provided by the healthy subjects.
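
The exact distance metric is not detailed here, so the sketch below is only one plausible choice: a point-wise deviation of a new trajectory from the mean healthy trajectory, scaled by the healthy variability. The function and array names are illustrative.

```python
import numpy as np

def deviation_from_nominal(new_traj, healthy_trajs):
    """Average absolute deviation of a new time-normalised trajectory from the
    mean healthy ('nominal') trajectory, scaled by the healthy standard deviation.
    new_traj: (100,) array; healthy_trajs: (n_recordings, 100) array."""
    mu = healthy_trajs.mean(axis=0)
    sigma = healthy_trajs.std(axis=0) + 1e-8   # avoid division by zero
    return float(np.mean(np.abs(new_traj - mu) / sigma))

# Lower scores indicate movements closer to the healthy reference; a threshold
# on this score could be used to flag movements that need a clinician's attention.
```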

Grasping a toothbrush or a telephone?

With a total of 520 task events, equally distributed across the four classes, we were able to reach a classification accuracy of 90%. This enabled us to correctly classify gestures from unseen subjects, whose new samples are recorded through an interactive platform that provides immediate onscreen feedback.

Interestingly, we noticed that the most relevant features were extracted from the first and central parts of the temporal profiles, which made it possible to recognise gestures using only the first half, or even as little as the first third, of the full-movement data: under these conditions we still achieved accuracy values between 87% and 90%.

[Figure: classification results with features selected by RFE]

RFE stands for the Recursive Feature Elimination method, MLP for the Multilayer Perceptron classifier and SVM for the Support Vector Machine classifier.

Therefore, we concluded that the tracking system is reliable for rehabilitation applications, albeit with some limitations of its current design related to variability in hand sizes and slippage of the glove fingertips. Concerning the classification task and its computational cost, two main outcomes should be noted:

  • The most informative signals are those coming from the palm, thumb and index finger; the glove could therefore be adapted, without any significant loss of accuracy, into a wearable device limited to these three sensing sites, making it easier to apply to patients with hand deformities.
  • The first part of the movement is enough for its classification, so computational requirements can be minimised; this also suggests that healthy subjects adopt a hand pre-shaping strategy during the approach, as reported in the neurological literature on infants and blindfolded adults.

This project, part of Konica Minolta Laboratory's activities in Digital Healthcare, represents a first step towards affordable systems integrated into the home environment, opening the way to tele-rehabilitation, where the recovery process is assessed in a non-intrusive way and the data sent to physicians enables them to check the patient's progress.
