
MARCO LA CASCIA

Hankelet-based action classification for motor intention recognition

  • Authors: Dindo, H.; Lo Presti, L.; La Cascia, M.; Chella, A.; Dedić, R.
  • Year of publication: 2017
  • Type: Journal article
  • OA Link: http://hdl.handle.net/10447/235083

Abstract

Powered lower-limb prostheses require a natural, easy-to-use interface for communicating the amputee's motor intention, in order to select the appropriate motor program in any given context or simply to switch from the active (powered) to the passive mode of functioning. To be widely accepted, such an interface should not place additional cognitive load on the end-user, and it should be reliable and minimally invasive. In this paper we present one such interface, based on a robust method for detecting and recognizing motor actions from a low-cost wearable sensor network mounted on the sound leg, which provides inertial (accelerometer, gyroscope, and magnetometer) data in real time. We assume that the sensor measurement trajectories, within a given temporal window, can be represented as the output of a linear time-invariant (LTI) system. We describe such a set of trajectories via a Hankel matrix, which embeds the observability matrix of the LTI system generating them. The use of Hankel matrices (known as Hankelets) avoids the burden of performing system identification while providing a computationally convenient descriptor for the dynamics of a time series. For the recognition of actions, we use two off-the-shelf classifiers, nearest neighbor (NN) and support vector machines (SVM), in cross-subject validation. We present results using either the joint angles or the raw sensor data, showing a clear improvement of the Hankelet-based approach over a baseline method. In addition, we compare results on action recognition using joint angles provided by trakSTAR, a high-accuracy motion tracking unit, demonstrating, somewhat surprisingly, that the best results (in terms of average recognition accuracy over the different actions) are obtained with raw inertial data. This paves the way towards a wider use of our method in the field of active prosthetics in particular, and motor intention recognition in general.
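
For readers interested in the mechanics, the sketch below illustrates the core idea in Python: a window of multi-channel sensor data is folded into a block-Hankel matrix, normalized so that the underlying dynamics (rather than signal amplitude) drive the comparison, and classified by nearest neighbor under the dissimilarity commonly used in the Hankelet literature. The function names, the NumPy implementation, and the data layout are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def hankelet(x, rows):
    """Build a normalized block-Hankel matrix (a "Hankelet") from a
    multi-channel time series.

    x    : array of shape (T, C) -- T samples of C sensor channels
           (this layout is an assumption made for the sketch).
    rows : number of block rows, i.e. an upper bound on the order of
           the LTI system assumed to generate the trajectories.
    """
    T, C = x.shape
    cols = T - rows + 1
    H = np.empty((rows * C, cols))
    for i in range(rows):
        # Block row i holds samples x[i], x[i+1], ..., x[i+cols-1],
        # so column j stacks the window x[j], ..., x[j+rows-1].
        H[i * C:(i + 1) * C, :] = x[i:i + cols].T
    # Normalize so that ||H H^T||_F = 1: comparisons then reflect the
    # span of the observability matrix, not the signal's magnitude.
    return H / np.sqrt(np.linalg.norm(H @ H.T, "fro"))

def dissimilarity(Ha, Hb):
    """Dissimilarity between two normalized Hankelets, as commonly used
    in the Hankelet literature: d = 2 - ||Ha Ha^T + Hb Hb^T||_F.
    It is 0 when both windows share the same LTI dynamics and largest
    when their observability subspaces are orthogonal."""
    return 2.0 - np.linalg.norm(Ha @ Ha.T + Hb @ Hb.T, "fro")

def nn_classify(query, train_hankelets, train_labels):
    """1-NN action classification over a gallery of precomputed Hankelets."""
    dists = [dissimilarity(query, H) for H in train_hankelets]
    return train_labels[int(np.argmin(dists))]

# Toy usage: two gallery windows of 9-channel inertial data and a query.
rng = np.random.default_rng(0)
gallery = [hankelet(rng.standard_normal((50, 9)), rows=5) for _ in range(2)]
labels = ["walk", "stand"]
query = hankelet(rng.standard_normal((50, 9)), rows=5)
print(nn_classify(query, gallery, labels))
```

With windows of raw accelerometer, gyroscope, and magnetometer samples as input, this corresponds to the NN branch of the pipeline described above; the SVM branch would instead feed a kernel derived from the same dissimilarity to an off-the-shelf SVM.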