
Y1Q3

This application was developed during the second semester of 2018, with the aim of significantly improving Y1M5 in the following respects:

  1. making greater use of R1's motor capabilities: R1 physically demonstrates to the patient the movement to replicate, removing the need to visualize the movement on a screen. This improvement is intended to strengthen the interaction between robot and patient, promoting the use of a robotic platform over a simple fixed camera, with the final aim of improving the quality of the patient's rehabilitation;

  2. extending the motion analysis to the end-point: the trajectory of the end-point (the hand) is evaluated during a reaching task, including its speed profile (a minimal computation of the speed profile is sketched after this list);

  3. extending the provided feedback: the feedback covers important properties of the movement, such as the speed and the range of motion, significantly improving the patient's ability to correct it.
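
The speed profile mentioned in point 2 can be obtained by numerically differentiating the sampled end-point positions. The sketch below is only illustrative: the array-based interface, the variable names, and the sampling rate are assumptions, not the application's actual code.

```python
# Minimal sketch (assumed, not the project's code): computing the end-point
# speed profile from sampled hand positions and their timestamps.
import numpy as np

def speed_profile(positions, timestamps):
    """positions: (N, 3) array of hand (x, y, z) samples [m];
    timestamps: (N,) array of acquisition times [s].
    Returns the (N-1,) array of instantaneous speeds [m/s]."""
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    dt = np.diff(timestamps)
    return displacements / dt

# Example: a straight reaching movement of 0.3 m over 2 s, sampled at 10 Hz
t = np.linspace(0.0, 2.0, 21)
traj = np.column_stack([0.15 * t, np.zeros_like(t), np.zeros_like(t)])
print(speed_profile(traj, t))  # roughly constant at 0.15 m/s
```
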

Taking these points into account, the application goes through the following steps:

  1. R1 engages the user, proposing a rehabilitative exercise for the upper limbs;
  2. R1 physically shows the exercise the user has to perform from a predefined repertoire (abduction, internal rotation, external rotation);

  3. R1 analyzes the execution of the exercise in real time, providing extended verbal feedback that covers the speed of the movement, the range of motion, and how well the target has been reached (a possible composition of this feedback is sketched below);
  4. the analyzed metrics are updated in real time and shown on the screen;
  5. R1 produces a final report of the exercise, which includes the information acquired during the interaction;
  6. R1 keeps engaging the user, proposing movements from the repertoire that have not been proposed before, relying on face recognition and stored data about past interactions (a possible selection scheme is sketched below).
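
Point 3 says the verbal feedback combines several metrics. One possible way of composing such a message is sketched below; the thresholds, metric names, and phrasing are illustrative assumptions and do not reflect the application's actual logic.

```python
# Minimal sketch (assumed): turning analyzed metrics into an extended verbal
# feedback sentence.  Thresholds and wording are illustrative only.
def verbal_feedback(mean_speed, target_speed, rom_deg, target_rom_deg, target_reached):
    parts = []
    if mean_speed < 0.8 * target_speed:
        parts.append("move a bit faster")
    elif mean_speed > 1.2 * target_speed:
        parts.append("slow down slightly")
    if rom_deg < 0.9 * target_rom_deg:
        parts.append("try to widen the range of motion")
    if not target_reached:
        parts.append("reach a little further towards the target")
    if not parts:
        return "Well done, the movement was executed correctly."
    return "Almost there: " + ", ".join(parts) + "."

print(verbal_feedback(mean_speed=0.10, target_speed=0.15,
                      rom_deg=70, target_rom_deg=90, target_reached=False))
```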

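Point 6 relies on stored data about past interactions to avoid repeating movements. The sketch below illustrates one possible selection scheme, keyed by the identifier returned by face recognition; the repertoire list and the plain-dict storage are assumptions, not the application's actual persistence layer.

```python
# Minimal sketch (assumed): picking the next exercise for a recognized user
# from the stored history of past interactions.
REPERTOIRE = ["abduction", "internal rotation", "external rotation"]

def next_exercise(user_id, history):
    """history maps a recognized user id to the exercises already proposed;
    returns the first unproposed one, or None when the repertoire is exhausted."""
    done = set(history.get(user_id, []))
    remaining = [ex for ex in REPERTOIRE if ex not in done]
    return remaining[0] if remaining else None

history = {"alice": ["abduction"]}
print(next_exercise("alice", history))  # -> "internal rotation"
```
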
Note

The code implementing this application is tagged as v0.2.0.