The objective of this study was to test the usability of a new auditory Brain-Computer Interface (BCI) application for communication. We introduce a word-based, intuitive auditory spelling paradigm, the WIN-speller. In the WIN-speller, letters are grouped by words; for example, the word KLANG represents the letters A, G, K, L, and N. Thereby, the decoding step between perceiving a code and translating it to the stimuli it represents becomes superfluous. We tested 11 healthy volunteers and four end-users with motor impairment in copy-spelling mode. Spelling was successful, with an average accuracy of 84% in the healthy sample. Three of the end-users communicated with average accuracies of 80% or higher, while one user was not able to communicate reliably. Even though further evaluation is required, the WIN-speller represents a potential alternative for BCI-based communication in end-users.
Although research on brain-computer interfaces (BCI) for controlling applications has expanded tremendously, we still face a translational gap when bringing BCI to end-users. To bridge this gap, we adapted the user-centered design (UCD) to BCI research and development, which implies a shift from focusing on single aspects, such as accuracy and information transfer rate (ITR), to a more holistic user experience. The UCD implements an iterative process between end-users and developers based on a valid evaluation procedure. Within the UCD framework, the usability of a device can be defined with regard to its effectiveness, efficiency, and satisfaction. We operationalized these aspects to evaluate BCI-controlled applications. Effectiveness was regarded as equivalent to the accuracy of selections, and efficiency to the amount of information transferred per time unit and the effort invested (workload). Satisfaction was assessed with questionnaires and visual-analogue scales. These metrics have been successfully applied to several BCI-controlled applications for communication and entertainment, which were evaluated by end-users with severe motor impairment. Results of four studies, involving a total of N = 19 end-users, revealed that effectiveness was moderate to high; efficiency in terms of ITR was low to high and workload low to medium; and satisfaction was moderate to high, depending on the match between user and technology and the type of application. The evaluation metrics suggested here within the framework of the UCD proved to be an applicable and informative approach to evaluating BCI-controlled applications, and end-users with severe impairment and in the locked-in state were able to participate in this process.
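The efficiency metric above rests on the information transfer rate. As an illustration only (the cited studies may use a variant), a minimal sketch of the standard Wolpaw ITR formula in Python; the 36-item speller and selection rate below are hypothetical:

```python
import math

def wolpaw_itr(n_choices: int, accuracy: float, selections_per_min: float) -> float:
    """Bits per minute via the standard Wolpaw formula (accuracy must be < 1)."""
    p = accuracy
    if p <= 1.0 / n_choices:
        return 0.0  # at or below chance level: no information transferred
    bits_per_selection = (
        math.log2(n_choices)
        + p * math.log2(p)
        + (1 - p) * math.log2((1 - p) / (n_choices - 1))
    )
    return bits_per_selection * selections_per_min

# Hypothetical example: a 36-item speller at 84% accuracy, 4 selections/min
print(round(wolpaw_itr(36, 0.84, 4.0), 2))  # → 14.86
```

Note that ITR collapses to zero at chance-level accuracy, which is why effectiveness and efficiency are reported separately.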
Modulation of sensorimotor rhythms (SMR) has been suggested as a control signal for brain-computer interfaces (BCI). Yet there is a population of users, estimated at between 10 and 50%, who are not able to achieve reliable control, and only about 20% of users achieve high (80–100%) performance. Predicting performance prior to BCI use would facilitate selection of the most feasible system for an individual, thus constituting a practical benefit for the user, and would increase our knowledge about the correlates of BCI control. In a recent study, we predicted SMR-BCI performance from psychological variables that were assessed prior to the BCI sessions; BCI control was supported with machine-learning techniques. We described two significant psychological predictors, namely visuo-motor coordination ability and the ability to concentrate on the task. The purpose of the current study was to replicate these results, thereby validating these predictors within a neurofeedback-based SMR-BCI that involved no machine learning. Thirty-three healthy BCI novices participated in a calibration session and three further neurofeedback training sessions. Two variables were related to mean SMR-BCI performance: (1) a measure of the accuracy of fine motor skills, i.e., a proxy for a person’s visuo-motor control ability; and (2) the subject’s “attentional impulsivity”. In a linear regression, they accounted for almost 20% of the variance in SMR-BCI performance, but predictor (1) failed to reach significance. Nevertheless, on the basis of our prior regression model for sensorimotor control ability, we could predict current SMR-BCI performance with an average prediction error of M = 12.07%. In more than 50% of the participants, the prediction error was smaller than 10%. Hence, psychological variables played a moderate role in predicting SMR-BCI performance in a neurofeedback approach that involved no machine learning. Future studies are needed to further consolidate (or reject) the present predictors.
Background
People with severe disabilities, e.g., due to neurodegenerative disease, depend on technology that allows for accurate wheelchair control. For those who cannot operate a wheelchair with a joystick, brain-computer interfaces (BCI) may offer a valuable option. Technology depending on visual or auditory input may not be feasible, as these modalities are dedicated to the processing of environmental stimuli (e.g., recognition of obstacles, ambient noise). Here we therefore validated the feasibility of a BCI based on tactually evoked event-related potentials (ERP) for wheelchair control. Furthermore, we investigated the use of a dynamic stopping method to improve the speed of the tactile BCI system.
Methods
Positions of four tactile stimulators represented navigation directions (left thigh: move left; right thigh: move right; abdomen: move forward; lower neck: move backward), and N = 15 participants delivered navigation commands by focusing their attention on the desired tactile stimulus in an oddball paradigm.
Results
Participants navigated a virtual wheelchair through a building, and eleven participants successfully completed the task of reaching four checkpoints in the building. The virtual wheelchair was equipped with simulated shared-control sensors (collision avoidance), yet these sensors were rarely needed.
Conclusion
We conclude that most participants achieved tactile ERP-BCI control sufficient to reliably operate a wheelchair, and that dynamic stopping was of high value for tactile ERP classification. Finally, this paper discusses the feasibility of tactile ERPs for BCI-based wheelchair control.
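Dynamic stopping can be illustrated with a simple confidence-margin rule: classifier scores for each navigation target are accumulated over stimulus repetitions, and a selection is emitted early once the best target leads the runner-up by a fixed margin. This is only a sketch, not the study's exact method; the scores and margin below are invented:

```python
from typing import Sequence

TARGETS = ["left", "right", "forward", "backward"]

def dynamic_stop(score_sequences: Sequence[Sequence[float]],
                 margin: float = 2.0,
                 max_repetitions: int = 10) -> tuple[str, int]:
    """Return (selected target, repetitions used).

    score_sequences: per-repetition classifier scores, one score per target.
    """
    totals = [0.0] * len(TARGETS)
    for rep, scores in enumerate(score_sequences[:max_repetitions], start=1):
        totals = [t + s for t, s in zip(totals, scores)]
        ranked = sorted(totals, reverse=True)
        if ranked[0] - ranked[1] >= margin:  # confident enough: stop early
            return TARGETS[totals.index(ranked[0])], rep
    # Margin never reached: fall back to the best target after all repetitions
    return TARGETS[totals.index(max(totals))], len(score_sequences[:max_repetitions])

# Hypothetical per-repetition scores where "forward" is the attended target
reps = [[0.2, 0.1, 1.1, 0.0],
        [0.1, 0.3, 1.4, 0.2],
        [0.0, 0.2, 1.2, 0.1]]
print(dynamic_stop(reps))  # stops after 2 repetitions instead of all 3
```

The speed gain comes from skipping repetitions once the evidence is decisive, at the cost of a tunable risk of premature selection.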
This paper describes a case study with a patient in the classic locked-in state who currently has no means of independent communication. Following a user-centered approach, we investigated event-related potentials (ERP) elicited in different modalities for use in brain-computer interface (BCI) systems. Such systems could provide her with an alternative communication channel. To investigate the most viable modality for achieving BCI-based communication, classic oddball paradigms (one rare and one frequent stimulus, ratio 1:5) in the visual, auditory, and tactile modalities were conducted (two runs per modality). Classifiers were built on one run and tested offline on the other (and vice versa). In these paradigms, the tactile modality was clearly superior to the other modalities, displaying high offline accuracy even when classification was performed on single trials only. Consequently, we tested the tactile paradigm online, and the patient successfully selected targets without any error. Furthermore, we investigated the use of the visual or tactile modality for different BCI systems with more than two selection options. In the visual modality, several BCI paradigms were tested offline. Neither matrix-based nor so-called gaze-independent paradigms constituted a means of control. These results may thus question the gaze-independence of current gaze-independent approaches to BCI. A tactile four-choice BCI resulted in high offline classification accuracies, yet online use raised various issues. Although performance was clearly above chance, practical daily-life use appeared unlikely when compared to other communication approaches (e.g., partner scanning). Our results emphasize the need for user-centered design in BCI development, including identification of the best stimulus modality for a particular user.
Finally, the paper discusses the feasibility of EEG-based BCI systems for patients in the classic locked-in state and compares BCI to other assistive technology (AT) solutions that we also tested during the study.
Background: One of the most common types of brain-computer interfaces (BCIs) is called a P300 BCI, since it relies on the P300 and other event-related potentials (ERPs). In the canonical P300 BCI approach, items on a monitor flash briefly to elicit the necessary ERPs. Very recent work has shown that this approach may yield lower performance than alternative paradigms in which the items do not flash but instead change in other ways, such as moving, changing colour, or changing to characters overlaid with faces.
Methodology/Principal Findings: The present study sought to extend this research direction by parametrically comparing different ways to change items in a P300 BCI. Healthy subjects used a P300 BCI across six different conditions. Three conditions were similar to our prior work, providing the first direct comparison of characters flashing, moving, and changing to faces. Three new conditions also explored facial motion and emotional expression. The six conditions were compared across objective measures, such as classification accuracy and bit rate, as well as subjective measures, such as perceived difficulty. In line with recent studies, our results indicated that the character-flash condition resulted in the lowest accuracy and bit rate. All four face conditions (mean accuracy >91%) yielded significantly better performance than the flash condition (mean accuracy = 75%).
Conclusions/Significance: Objective results reaffirmed that the face paradigm is superior to the canonical flash approach that has dominated P300 BCIs for over 20 years. The subjective reports indicated that the conditions that yielded better performance were not considered especially burdensome. Therefore, although further work is needed to identify which face paradigm is best, it is clear that the canonical flash approach should be replaced with a face paradigm when aiming to increase bit rate. However, the face paradigm has to be explored further in practical applications, particularly with locked-in patients.