
Exoskeleton Control System Based on Motor-Imaginary Brain–Computer Interface

Gordleeva S.Yu., Lukoyanov M.V., Mineev S.A., Khoruzhko M.A., Mironov V.I., Kaplan A.Ya., Kazantsev V.B.

Key words: brain–computer interface; motor imagery; lower limb exoskeleton; exoskeleton control system; post-stroke rehabilitation.

The aim of the investigation was to develop a neuro-integrated control system for a lower-limb robotic exoskeleton (RE) using brain–computer interface (BCI) technology based on recognition of EEG patterns evoked by motor imagery of limb movement.

Materials and Methods. The proposed neuro-integrated RE control system based on BCI technology consists of three main modules: an EEG signal recording module, an EEG signal classifier, and software for transmitting commands to the RE. EEG patterns evoked by motor imagery are recognized by a classifier based on linear discriminant analysis, using features identified by spatial filtering with the CSP method applied pairwise to all command types. The proposed algorithms for classifying motor imagery patterns and the user training techniques make it possible to reliably distinguish several (up to 4) different commands. After training and testing the classifier, the operator may proceed to control the external device, i.e., the lower-limb RE. RE control software has been developed for easy system customization; it has a simple graphical user interface and allows the user to change the mapping between RE patterns and commands during operation.

Results. In tests on 14 healthy volunteers, the accuracy of lower-limb exoskeleton control based on the developed motor imagery BCI averaged 70% for three commands across three sessions.

Conclusion. The developed RE control system based on BCI technology offers fairly high accuracy for three commands. Operators successfully learn to practice motor imagery and to operate within the BCI loop, even with no previous experience of working with brain–machine interfaces.


Today, one of the fastest growing areas of medical technology is the development and manufacturing of neuro-integrated devices based on brain–computer interface (BCI) technology combined with robots (orthoses, exoskeletons) for neurorehabilitation purposes [1, 2]. The impetus for this growth was the discovery of training-induced plastic changes in the functional topography of the primary motor cortex [3]. Studies by Bach-y-Rita and Taub [4, 5] showed that movement can be recovered even several years after stroke. This has provided new opportunities for neurorehabilitation: a strategy of intensive, regular, and motivated movement training has been developed [6]. Exoskeletons have proved to be ideal technical devices for implementing this strategy. Different types of exoskeletons are already in active clinical use, and the number of these developments has grown exponentially in the last 10 years [7].

Exoskeletons used successfully for neurorehabilitation include devices reproducing the movements of the upper limb joints [8] and devices focused on simulating walking. Among the latter, we can distinguish exoskeletons performing biomechanically correct movements in the hip and knee joints of the patient, such as LOPES (lower extremity powered exoskeleton) [9], ALEX (active leg exoskeleton) [10], ReWalk (wearable robotic exoskeleton) [http://rewalk.com/], eLEGS (exoskeleton lower extremity gait system) [http://bleex.me.berkeley.edu/research/exoskeleton/elegs], Rex (robotic exoskeleton) [http://www.rexbionics.com/], MINDWALKER [11], and others.

Although several types of BCI are currently known [12, 13], the most popular types of non-invasive BCI for controlling exoskeletons are the so-called synchronous BCIs, based on recording the operator's EEG response to an external stimulus environment. They include resonance frequency BCIs, which exploit the well-known tendency of spontaneous EEG rhythms to entrain to external sources of rhythmic stimulation [14], and P300-based BCIs, which detect the P300 evoked-potential wave in response to the stimuli (symbols) intended by the user [15, 16]. The accuracy of command detection in such BCIs reaches 95–97% [17], but the operator's attention must remain constantly focused on the matrix stimulus environment. Examples of resonance frequency BCIs applied to controlling lower-limb robotic exoskeletons (REs) are MINDWALKER [11] and the Korean lower-limb exoskeleton [18].

Unlike synchronous BCIs, in asynchronous motor imagery based BCI technology (MI–BCI — motor imagery based brain–computer interface), the EEG rhythm changes detected as control commands are invoked by voluntary efforts of the human operator, irrespective of any external sensory stimulation. Although this BCI type is considerably more difficult to master than resonance frequency and P300-based BCIs, it is precisely the motor imagery BCI that is considered the most promising for motor function training [1, 19, 20]. The operating principle of a motor imagery based BCI is the detection of sensorimotor rhythm desynchronization in the motor area of the cerebral cortex contralateral to the imagined motor act, when the operator imagines movements such as grasping or moving the fingers [21]. It has been shown that even patients with paresis of the extremities can successfully imagine various movements of paralyzed body parts [22]. BCI-RoGO [23] and NeuroRex [24] are examples of using such BCIs to control exoskeletons.

Notably, successful results have recently been published from a study on the rehabilitation of people with paraplegia caused by spinal cord lesions using motor imagery based BCI technology [25]. The rehabilitation protocol included comprehensive training that combined control of a virtual reality scenario, a lower-limb exoskeleton, tactile feedback, and locomotor activity through BCI technology.

The aim of the investigation was to develop a neuro-integrated control system for a lower-limb robotic exoskeleton based on recognition of EEG patterns using motor imagery brain–computer interface technology.

Materials and Methods. The experimental study involved 14 healthy subjects (6 males and 8 females aged 18 to 23 years) who were informed about the terms and conditions of the experiment before it started and provided written informed consent for participation. The study protocol was approved by the Bioethics Committee of Lobachevsky State University of Nizhni Novgorod. The experiments were carried out without the operator wearing the exoskeleton. During the experiment, the exoskeleton stood on a stand near the operator and performed movements with the left or right leg depending on the EEG pattern generated by the operator (see Media https://drive.google.com/file/d/0BzOUw8ncip3md1dPdzZmZmtka3c/view).

The developed neuro-integrated RE control system based on BCI technology consists of three main modules: EEG signal recording module, EEG signal classifier and the software for transmission of commands to RE.

EEG signal recording module. EEG signals were recorded using the certified NVX 52 amplifier (LLC “Medical Computer Systems”, Russia). Eight leads were used to record EEG (FCz, C5, C3, C1, Cz, C2, C4, C6), arranged according to the international 10-10 system (Figure 1). The reference electrode was placed on the left ear lobe; the grounding electrode was on the forehead. The signal digitization frequency was 500 Hz. Electrode impedance did not exceed 10 kΩ. The signal was band-pass filtered in the range of 6 to 15 Hz with a 50 Hz notch filter.
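The recording settings above (500 Hz sampling, 6–15 Hz band-pass, 50 Hz notch) can be sketched in Python with SciPy; the filter orders and notch Q factor here are illustrative assumptions, since the paper does not state them:

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 500  # sampling rate, Hz (as stated in the paper)

def preprocess(eeg, fs=FS):
    """Band-pass 6-15 Hz plus a 50 Hz notch, applied per channel.

    eeg: array of shape (n_channels, n_samples).
    A 4th-order Butterworth and Q=30 notch are assumptions; zero-phase
    filtering (filtfilt) avoids phase distortion in offline analysis.
    """
    b_bp, a_bp = butter(4, [6, 15], btype="bandpass", fs=fs)
    b_n, a_n = iirnotch(50.0, Q=30.0, fs=fs)
    out = filtfilt(b_bp, a_bp, eeg, axis=-1)
    out = filtfilt(b_n, a_n, out, axis=-1)
    return out
```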


Figure 1. The diagram of training and testing the classifier in the loop of the motor imagery brain–computer interface; layout of the electrodes used in the experiment

EEG signal classifier. The experimental procedure of operating the BCI-controlled RE consisted of three consecutive sessions: a training session, a test session, and a control session. The training and test sessions were used for the initial setup and testing of the classifier, respectively.

During classifier training, the operator performed one of three tasks: rest, when a cross appeared on the monitor, or left- or right-hand motor imagery, when a “left”/“right” arrow was shown on the monitor. The operator was asked to choose any hand movement considered comfortable for imagery; exemplary movements were finger movements and rotation of the hand in the wrist joint. The “rest” command meant that the operator sat still, concentrating on breathing. Each 5-second command was presented 10 times; the inter-stimulus interval was 3 s (empty screen), and the stimuli were presented in random order. Classifier training thus took 4 min. While the stimuli were presented, EEG was recorded using NeoRec software (LLC “Medical Computer Systems”, Russia), which streamed the signal via the LSL protocol. The stream was read by a script written in Python (www.python.org), which synchronized stimulus presentation with the EEG signal.

To monitor how successfully the motor imagery technique was being mastered, the degree of sensorimotor rhythm desynchronization was assessed. After stimulus presentation was completed, spatial filtering (surface Laplacian method [26]) was applied to the EEG recording in all channels. The power in the frequency range from 6 to 15 Hz was then calculated in 1 Hz steps for each data set corresponding to a stimulus type, in each channel separately. For the records corresponding to motor imagery, the power change relative to “rest” was calculated, and the results were mapped (Figure 2). If spectral power during motor imagery decreased (desynchronized) by more than 50%, the operator was considered to have successfully mastered the motor imagery technique and proceeded to the classifier test session. After several failed attempts, the procedure was repeated with a different type of motor imagery.
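The desynchronization measure described above (percent power change in 1 Hz bins relative to “rest”) can be sketched as follows. The Welch parameters are assumptions, and the surface Laplacian step is omitted for brevity:

```python
import numpy as np
from scipy.signal import welch

FS = 500  # sampling rate, Hz

def erd_percent(move_trials, rest_trials, fs=FS):
    """Event-related desynchronization as percent power change vs. rest.

    move_trials, rest_trials: arrays (n_trials, n_samples) for one channel.
    Returns frequencies in the 6-15 Hz range (1 Hz steps) and ERD in
    percent; negative values indicate desynchronization.
    """
    def band_power(trials):
        # nperseg = fs gives 1 Hz spectral resolution (an assumed choice)
        f, p = welch(trials, fs=fs, nperseg=fs)
        mask = (f >= 6) & (f <= 15)
        return f[mask], p[:, mask].mean(axis=0)  # average over trials

    f, p_move = band_power(move_trials)
    _, p_rest = band_power(rest_trials)
    return f, 100.0 * (p_move - p_rest) / p_rest
```

With the paper's criterion, an operator is considered successful when the ERD value at the subject's reactive frequency falls below -50%.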


Figure 2. Exemplary topographic mapping of the EEG desynchronization degree for one operator:

(a) left hand motor imagery; (b) right hand motor imagery. The maps show desynchronization at the 13 Hz frequency; dark color corresponds to maximum desynchronization, light color to minimum desynchronization

During classifier testing, the results of mental task recognition were presented to the operator as visual feedback: a green scale, starting at a circle in the center of the screen where the subject fixed the gaze, filled toward the edge of the screen while the classifier recognized the task matching the given command, and stopped filling when another task was recognized (see Figure 1).

Recognition was performed by a classifier based on linear discriminant analysis, using features identified by spatial filtering with the CSP (common spatial patterns) method [27] applied pairwise to all command types. The decisions of the pairwise classifiers were combined by voting.
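A minimal sketch of such a pairwise CSP + LDA scheme with voting, in plain NumPy/SciPy, might look as follows. The number of CSP filter pairs and the regularization constant are assumptions not stated in the paper:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=2):
    """CSP spatial filters for two classes.
    trials_*: (n_trials, n_channels, n_samples); returns (2*n_pairs, n_channels)."""
    mean_cov = lambda trials: np.mean([np.cov(tr) for tr in trials], axis=0)
    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # generalized eigendecomposition: extremal eigenvectors maximize the
    # variance ratio between the two classes
    w, v = eigh(ca, ca + cb)
    order = np.argsort(w)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return v[:, picks].T

def log_var_features(trials, filters):
    """Log of normalized variance of the CSP-projected signals."""
    proj = np.einsum('fc,ncs->nfs', filters, trials)
    var = proj.var(axis=-1)
    return np.log(var / var.sum(axis=1, keepdims=True))

class PairwiseCSPLDA:
    """One CSP + LDA classifier per command pair; decisions combined by voting."""
    def fit(self, trials, labels):
        self.classes_ = np.unique(labels)
        self.models_ = {}
        for i, a in enumerate(self.classes_):
            for b in self.classes_[i + 1:]:
                ta, tb = trials[labels == a], trials[labels == b]
                filt = csp_filters(ta, tb)
                xa, xb = log_var_features(ta, filt), log_var_features(tb, filt)
                mu_a, mu_b = xa.mean(0), xb.mean(0)
                cov = np.cov(np.vstack([xa - mu_a, xb - mu_b]).T)
                # Fisher discriminant with a small ridge for stability
                w = np.linalg.solve(cov + 1e-6 * np.eye(len(cov)), mu_a - mu_b)
                thr = w @ (mu_a + mu_b) / 2
                self.models_[(a, b)] = (filt, w, thr)
        return self

    def predict(self, trials):
        votes = np.zeros((len(trials), len(self.classes_)), int)
        idx = {c: k for k, c in enumerate(self.classes_)}
        for (a, b), (filt, w, thr) in self.models_.items():
            scores = log_var_features(trials, filt) @ w - thr
            for n, s in enumerate(scores):
                votes[n, idx[a if s > 0 else b]] += 1
        return self.classes_[votes.argmax(axis=1)]
```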

After classifier training and testing, the operator could start controlling the external device, i.e., the lower-limb RE [28]. The operator was asked to execute one of three commands (left- or right-hand motor imagery, or rest). Every 4.5 s the classifier analyzed the EEG recording, made a decision, and transmitted the command chosen by the operator to the external device.

The software for transmitting commands to the robotic exoskeleton. Figure 3 shows the software operation diagram.


Figure 3. The diagram of the robotic exoskeleton control system based on the motor imagery brain–computer interface

The BCI classifier software and the RE control software are installed on a personal computer running the Windows operating system. The web service controlling the exoskeleton runs on a microcomputer built into the exoskeleton under a Linux-compatible operating system. Data are transferred between the computer and the exoskeleton over a wireless Wi-Fi channel.

The RE control system based on the motor imagery BCI works as follows. As described above, the operator's EEG data are transferred from the NVX 52 amplifier to the classifier in the LSL format. Every 4.5 s the number of the pattern recognized by the classifier is packed into a User Datagram Protocol (UDP) datagram and sent over the IP network to the RE control software. For easy system customization, the authors developed original RE control software. The software has a graphical user interface and allows the user to change the mapping between patterns and RE commands during operation (Figure 4). The program is written in C++ using the Qt library (https://www.qt.io/).
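Sending the recognized pattern number as a UDP datagram might look as follows in Python; the destination address, port, and the one-byte payload format are assumptions, since the paper does not specify the packet layout:

```python
import socket
import struct

# assumed address of the machine running the RE control software
RE_HOST, RE_PORT = "192.168.0.10", 5005

def send_pattern(pattern_id, host=RE_HOST, port=RE_PORT):
    """Send the recognized pattern number as a single UDP datagram.

    The payload here is one unsigned byte; the real packet format used by
    the authors' software is not documented in the paper.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(struct.pack("B", pattern_id), (host, port))
    finally:
        sock.close()
```

UDP fits this use case: each 4.5 s decision supersedes the previous one, so a lost datagram is simply replaced by the next classification result.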


Figure 4. Graphical user interface of the robotic exoskeleton control software

Using the pre-set correspondence between pattern number and exoskeleton command, the RE control software sends a request to the web service of the exoskeleton to perform the relevant command. The web service is a program running on a BeagleBoard-xM microprocessor board under the Angstrom Linux distribution. Its interface is implemented using the gSOAP code generator (https://www.cs.fsu.edu/~engelen/soap.html) and provides the ability to run movement patterns of the exoskeleton limbs. A total of 10 patterns have been implemented (stand up; sit down; bend/extend the right leg; bend/extend the left leg; vertical; embryo; walk). Each pattern implements an algorithm of synchronous control over a group of 4 actuators. The actuator parameters are the limiting angles at which movement in the corresponding joints is terminated and the maximum torque applied to the actuator shafts of the joints while movement is performed.

The following associations of patterns and commands were used for the exoskeleton:

pattern 1 (left limb motor imagery movement) — flexion/extension of the left leg of the exoskeleton;

pattern 2 (right limb motor imagery movement) — flexion/extension of the right leg of the exoskeleton.
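The associations above amount to a small lookup table in the control software; the command names below are hypothetical, chosen only to mirror the list:

```python
# Hypothetical mapping between classifier pattern numbers and exoskeleton
# commands, mirroring the associations listed above; the "rest" state maps
# to no command.
PATTERN_TO_COMMAND = {
    1: "flex_extend_left_leg",   # left limb motor imagery
    2: "flex_extend_right_leg",  # right limb motor imagery
}

def command_for(pattern_id):
    """Return the exoskeleton command for a pattern, or None (no action)."""
    return PATTERN_TO_COMMAND.get(pattern_id)
```

Keeping the mapping in one editable table is what lets the GUI remap patterns to commands during operation.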

Results. The operators’ results were used to calculate the average accuracy of exoskeleton control based on the given BCI as the ratio of the number of correctly entered commands to the total number of attempts. Each operator was given 10 attempts per command in each session.
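The accuracy measure defined above is simply:

```python
def control_accuracy(correct, attempts):
    """Control accuracy (%) as the ratio of correctly entered commands
    to the total number of attempts, as defined in the paper."""
    return 100.0 * correct / attempts

# With 10 attempts per command and 3 commands, a session has 30 attempts;
# e.g. 21 correct commands give 70% accuracy.
```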

Testing on 14 volunteers (see the Table) showed average control accuracies with the developed motor imagery BCI of 73, 71, and 66% for three commands in the three sessions, respectively. Evidently, the average accuracy of choosing commands does not vary greatly between sessions. It is also interesting that individual accuracy in classifier testing predicts (correlation coefficient 0.8) subsequent exoskeleton operation accuracy. Notably, about one third of the subjects achieved fairly high accuracy (90% and above).


Classification accuracy (%) in three sessions of brain–computer interface operation (sessions 1 and 2 — testing the classifier; session 3 — operating the exoskeleton)

In the future, we plan to test the developed control system in experiments in which the exoskeleton is worn by healthy people and by people with severe motor disabilities, and to develop individual operator training techniques to achieve higher control accuracy.

Conclusion. The developed control system for a lower-limb robotic exoskeleton based on a motor imagery brain–computer interface offers fairly high accuracy for three commands. As a robotic rehabilitation device, it can be used for active electromechanical assistance of movements in patients with severe motor disabilities, and for developing and training impaired postural functions in patients with disorders affecting motor centers and neural signal conduction in the spinal cord that present as partial or complete paraplegia (stroke, traumatic brain injury, spinal cord injury, cerebral palsy, and other diseases).

Study Funding. The study was supported by the grant of the Russian Science Foundation (project No.15-19-20053).

Conflicts of Interest. The authors have no conflicts of interest to disclose.


References

  1. Kaplan A.Ya. Neurophysiological foundations and practical realizations of the brain–machine interfaces in the technology in neurological rehabilitation. Human Physiology 2016; 42(1): 103–110, https://doi.org/10.1134/s0362119716010102.
  2. Fedotchev А.I., Parin S.B., Polevaya S.A., Velikova S.D. Brain–computer interface and neurofeedback technologies: current state, problems and clinical prospects (review). Sovremennye tehnologii v medicine 2017; 9(1): 175–184, https://doi.org/10.17691/stm2017.9.1.22.
  3. Nudo R.J., Milliken G.W., Jenkins W.M., Merzenich M.M. Use-dependent alterations of movement representations in primary motor cortex of adult squirrel monkeys. J Neurosci 1996; 16(2): 785–807.
  4. Bach-Y-Rita P. Theoretical and practical considerations in the restoration of function after stroke. Top Stroke Rehabil 2001; 8(3): 1–15, https://doi.org/10.1310/8t1t-etxu-8pdf-9x7f.
  5. Taub E., Uswatte G., Elbert T. New treatments in neurorehabilitation founded on basic research. Nat Rev Neurosci 2002; 3(3): 228–236, https://doi.org/10.1038/nrn754.
  6. Kwakkel G., Wagenaar R.C., Twisk J.W., Lankhorst G.J., Koetsier J.C. Intensity of leg and arm training after primary middle-cerebral-artery stroke: a randomised trial. Lancet 1999; 354(9174): 191–196, https://doi.org/10.1016/s0140-6736(98)09477-x.
  7. Marchal-Crespo L., Reinkensmeyer D.J. Review of control strategies for robotic movement training after neurologic injury. J Neuroeng Rehabil 2009; 6(1): 20, https://doi.org/10.1186/1743-0003-6-20.
  8. Frolov A.A., Biryukova E.V., Bobrov P.D., Mokienko O.A., Platonov A.K., Pryanichnikov V.E., Chernikova L.A. Principles of neurorehabilitation based on the brain–computer interface and biologically adequate control of the exoskeleton. Human Physiology 2013; 39(2): 196–208, https://doi.org/10.1134/s0362119713020035.
  9. Van der Kooij H., Koopman B., van Asseldonk E.H.F. Body weight support by virtual model control of an impedance controlled exoskeleton (LOPES) for gait training. Conf Proc IEEE Eng Med Biol Soc 2008; 2008: 1969–1972, https://doi.org/10.1109/iembs.2008.4649574.
  10. Banala S.K., Seok H.K., Agrawal S.K., Scholz J.P. Robot assisted gait training with active leg exoskeleton (ALEX). IEEE Trans Neural Syst Rehabil Eng 2009; 17(1): 2–8, https://doi.org/10.1109/tnsre.2008.2008280.
  11. Wang S., Wang L., Meijneke C., van Asseldonk E., Hoellinger T., Cheron G., Ivanenko Y., La Scaleia V., Sylos-Labini F., Molinari M., Tamburella F., Pisotta I., Thorsteinsson F., Ilzkovitz M., Gancet J., Nevatia Y., Hauffe R., Zanow F., van der Kooij H. Design and control of the MINDWALKER exoskeleton. IEEE Trans Neural Syst Rehabil Eng 2015; 23(2): 277–286, https://doi.org/10.1109/tnsre.2014.2365697.
  12. Kaplan A.Ya., Lim J.J., Jin K.S., Park B.W., Byeon J.G., Tarasova S.U. Unconscious operant conditioning in the paradigm of brain-computer interface based on color perception. Int J Neurosci 2005; 115(6): 781–802, https://doi.org/10.1080/00207450590881975.
  13. Wolpaw J.R., Birbaumer N., McFarland D.J., Pfurtscheller G., Vaughan T.M. Brain–computer interfaces for communication and control. Clin Neurophysiol 2002; 113(6): 767–791, https://doi.org/10.1016/s1388-2457(02)00057-3.
  14. Kelly S.P., Lalor E.C., Finucane C., McDarby G., Reilly R.B. Visual spatial attention control in an independent brain–computer interface. IEEE Trans Biomed Eng 2005; 52(9): 1588–1596, https://doi.org/10.1109/tbme.2005.851510.
  15. Kaplan A.Ya., Zhigulskaya D.D., Kirjanov D.A. Studying the ability to control human phantom fingers in P300 brain-computer interface. Bulletin of Russian State Medical University 2016; 2: 24–28.
  16. Farwell L.A., Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr Clin Neurophysiol 1988; 70(6): 510–523, https://doi.org/10.1016/0013-4694(88)90149-6.
  17. Brunner P., Bianchi L., Guger C., Cincotti F., Schalk G. Current trends in hardware and software for brain–computer interfaces (BCIs). J Neural Eng 2011; 8(2): 025001, https://doi.org/10.1088/1741-2560/8/2/025001.
  18. Kwak N.S., Müller K.R., Lee S.W. A lower limb exoskeleton control system based on steady state visual evoked potentials. J Neural Eng 2015; 12(5): 056009, https://doi.org/10.1088/1741-2560/12/5/056009.
  19. Kaplan A., Vasilyev A., Liburkina S., Yakovlev L. Poor BCI performers still could benefit from motor imagery training. In: Schmorrow D., Fidopiastis C. (editors). Foundations of augmented cognition: neuroergonomics and operational neuroscience. AC 2016. Lecture notes in computer science. Vol 9743. Springer International Publishing, Cham; 2016; p. 46–56, https://doi.org/10.1007/978-3-319-39955-3_5.
  20. Mulder Th. Motor imagery and action observation: cognitive tools for rehabilitation. J Neural Transm (Vienna) 2007; 114(10): 1265–1278, https://doi.org/10.1007/s00702-007-0763-z.
  21. Vasilyev A.N., Liburkina S.P., Kaplan A.Ya. Lateralization of EEG рatterns in humans during motor imagery of arm movements in the brain–computer interface. Zhurnal vysshey nervnoy deyatel’nosti im. I.P. Pavlova 2016; 66(3): 302–312, https://doi.org/10.7868/s0044467716030126.
  22. De Vries S., Tepper M., Feenstra W., Oosterveld H., Boonstra A.M., Otten B. Motor imagery ability in stroke patients: the relationship between implicit and explicit motor imagery measures. Front Hum Neurosci 2013; 7: 790, https://doi.org/10.3389/fnhum.2013.00790.
  23. Do A.H., Wang P.T., King C.E., Chun S.N., Nenadic Z. Brain-computer interface controlled robotic gait orthosis. J Neuroeng Rehabil 2013; 10(1): 111, https://doi.org/10.1186/1743-0003-10-111.
  24. Contreras-Vidal J.L., Grossman R.G. NeuroRex: a clinical neural interface roadmap for EEG-based brain machine interfaces to a lower body robotic exoskeleton. Conf Proc IEEE Eng Med Biol Soc 2013; 2013: 1579–1582, https://doi.org/10.1109/embc.2013.6609816.
  25. Donati A.R., Shokur S., Morya E., Campos D.S., Moioli R.C., Gitti C.M., Augusto P.B., Tripodi S., Pires C.G., Pereira G.A., Brasil F.L., Gallo S., Lin A.A., Takigami A.K., Aratanha M.A., Joshi S., Bleuler H., Cheng G., Rudolph A., Nicolelis M.A. Long-term training with a brain-machine interface-based gait protocol induces partial neurological recovery in paraplegic patients. Sci Rep 2016; 6: 30383, https://doi.org/10.1038/srep30383.
  26. Hjorth B. An on-line transformation of EEG scalp potentials into orthogonal source derivations. Electroencephalogr Clin Neurophysiol 1975; 39(5): 526–530, https://doi.org/10.1016/0013-4694(75)90056-5.
  27. Koles Z.J., Lazar M.S., Zhou S.Z. Spatial patterns underlying population differences in the background EEG. Brain Topogr 1990; 2(4): 275–284, https://doi.org/10.1007/bf01129656.
  28. Mineev S.A., Novikov V.A., Kuzmina I.V., Shatalin R.A., Grin I.V. Goniometric sensor interface for exoskeleton system control device. Biomed Eng 2016; 49(6): 357–361, https://doi.org/10.1007/s10527-016-9566-6.

