Project

Real-time multiway decoding of performed, observed and imagined movement electrocorticography supported by avatar-based user training

Code
3G0A4118
Duration
01 January 2018 → 31 December 2021
Funding
Research Foundation - Flanders (FWO)
Research disciplines
No data available
Keywords
Brain-Computer Interfaces
Project description

Brain-Computer Interfaces (BCIs) decode brain activity with the aim of establishing a communication channel that does not rely on muscular control. BCIs usually rely on EEG signals acquired from the subject's scalp or on electrophysiological signals from brain implants. The latter yield superior decoding performance, but as the implant damages the cortical tissue, long-term signal stability is a concern. EEG does not require surgery, but the information throughput of EEG-based BCIs is quite low and their suitability for independent daily use rather limited.

Electrocorticography (ECoG) offers new perspectives for BCI. Although accuracy, long-term stability and independent use have already been demonstrated, what is still lacking is the ability to control finger or hand movements from the patient's self-paced but imagined counterparts. Achieving this is the main objective of this project.

But motor imagery is a skill that needs to be learnt. To meet this challenge, we will: 1) provide natural feedback to the patient by showing a hand "avatar" controlled by motor imagery; and 2) complement the decoder with what was learnt under observed movement, adjusted by the patient's ability to achieve control. This requires a new type of self-paced ECoG decoder: one that is able to assist and keep up with subject training. We will develop an iterative tensor-based decoder with automatic rank selection that maximizes decoding accuracy while limiting computational and storage costs, so as to ensure on-line performance.
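To illustrate the kind of machinery involved, the sketch below shows a minimal tensor decomposition with a simple form of automatic rank selection. This is not the project's actual decoder; it assumes a CP (canonical polyadic) model of a 3-way data tensor, fitted by alternating least squares, with the rank grown until the relative reconstruction error falls below a threshold.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of U (I x R) and V (J x R)."""
    R = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, R)

def cp_als(X, rank, n_iter=100, seed=0):
    """Fit a rank-`rank` CP decomposition of a 3-way tensor X by
    alternating least squares; return the factors and the relative error."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings (C-order flattening of the remaining axes).
    X0 = X.reshape(I, J * K)
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
    rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
    return (A, B, C), rel_err

def select_rank(X, max_rank=5, tol=0.05):
    """Grow the CP rank until the relative reconstruction error drops
    below `tol` -- a crude stand-in for automatic rank selection."""
    for rank in range(1, max_rank + 1):
        _, rel_err = cp_als(X, rank)
        if rel_err < tol:
            return rank, rel_err
    return max_rank, rel_err

# Example: recover the rank of a synthetic rank-2 tensor.
rng = np.random.default_rng(42)
A = rng.standard_normal((6, 2))
B = rng.standard_normal((5, 2))
C = rng.standard_normal((4, 2))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
rank, err = select_rank(X, max_rank=4)
```

A production decoder for self-paced control would additionally update the factors incrementally as new ECoG samples arrive, rather than refitting from scratch, which is where the iterative formulation and the cap on computational and storage cost become essential.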