ABSTRACT
An important component of upper-extremity neuroprostheses is the user command
interface, which enables users to control the grasping and releasing of their own hand. This
work presents a brain-computer interface (BCI) for online control of hand grasping, opening,
and holding in an interactive virtual-reality environment. The mental states to be
discriminated are imagined hand grasping, imagined hand opening, and the idle state.
Identifying the most characteristic features of the observed data is critical in many pattern
recognition applications for minimizing classification error and improving accuracy. In this
work, we use mutual information to select the features that jointly have the largest
dependency on the target class. Experiments on five subjects yielded an average accuracy of
90%; the best classification rate obtained was 100%.
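The abstract does not specify the exact selection algorithm, so as a hedged illustration only: maximizing the joint dependency of a feature subset on the class is commonly approximated greedily, trading off each candidate feature's relevance (mutual information with the class) against its redundancy (average mutual information with features already chosen), as in mRMR-style schemes. A minimal sketch for discrete-valued features, with all function names (`mutual_info`, `select_features`) being illustrative rather than taken from the paper:

```python
import numpy as np

def mutual_info(x, y):
    """Empirical mutual information (in nats) between two discrete 1-D arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))  # joint probability estimate
            if pxy > 0:
                px = np.mean(x == xv)
                py = np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def select_features(X, y, k):
    """Greedy mRMR-style selection: at each step pick the feature that
    maximizes relevance to the class y minus average redundancy with
    the features already selected."""
    selected = []
    remaining = list(range(X.shape[1]))
    while len(selected) < k and remaining:
        best, best_score = None, -np.inf
        for j in remaining:
            relevance = mutual_info(X[:, j], y)
            redundancy = (np.mean([mutual_info(X[:, j], X[:, s])
                                   for s in selected])
                          if selected else 0.0)
            score = relevance - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

In a BCI setting, the columns of `X` would hold discretized EEG-derived features (e.g. binned band-power values) and `y` the labels for the imagined-grasp, imagined-open, and idle classes.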