A LabVIEW Based Brain-Computer Interface Application for Controlling a Virtual Robotic Arm Using the P300 Evoked Biopotentials and the EEG Bandpower Rhythms Acquired from the GTEC Unicorn Headset
RUȘANU, Oana Andreea. A LabVIEW Based Brain-Computer Interface Application for Controlling a Virtual Robotic Arm Using the P300 Evoked Biopotentials and the EEG Bandpower Rhythms Acquired from the GTEC Unicorn Headset. In: IFMBE Proceedings: 6th International Conference on Nanotechnologies and Biomedical Engineering, Ed. 6, 20-23 September 2023, Chişinău. Chişinău: Springer Science and Business Media Deutschland GmbH, 2023, Edition 6, Vol. 92, pp. 103-112. ISBN 978-3-031-42781-7. ISSN 1680-0737. DOI: https://doi.org/10.1007/978-3-031-42782-4_12
IFMBE Proceedings
Edition 6, Vol. 92, 2023
Conference "6th International Conference on Nanotechnologies and Biomedical Engineering"
Ed. 6, Chişinău, Moldova, 20-23 September 2023


DOI: https://doi.org/10.1007/978-3-031-42782-4_12

Pages 103-112

Rușanu Oana Andreea
 
Transilvania University of Brașov
 
 
Available in IBN: 31 October 2023


Abstract

The brain-computer interface is an advanced technology, inspired by science fiction, with a strong impact on helping people with neuromotor disabilities suffering from complete paralysis. Translating thoughts, by processing and classifying the EEG signals acquired from the brain, makes it possible to control mechatronic systems such as robotic arms or smart wheelchairs aimed at the medical assistance of disabled persons. A robotic arm is needed for grasping and moving objects according to the user's intention. Moreover, a simulation based on a virtual robotic arm can precede experimentation with a complex and expensive physical robotic arm. This paper presents a prototype of a simple brain-computer interface, implemented in the LabVIEW programming environment, for controlling a virtual robotic arm using commands derived from the P300 evoked biopotentials and the EEG rhythms acquired with the GTEC Unicorn headset and its official applications. The integration between the proposed LabVIEW instrument and the Unicorn user interfaces is achieved by UDP data transfer. The P300 speller board of human faces generates the commands that animate specific joints of the virtual robotic arm. The EEG frequency bands (delta, theta, alpha, beta, gamma) are mapped to the angle values of each joint (shoulder, elbow, wrist) composing the 3D robotic arm. The purpose of the proposed application is to train people in using a brain-computer interface. It also demonstrates the integration of the LabVIEW development environment with the Unicorn EEG technology through UDP transfer.
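The UDP-based pipeline described in the abstract (Unicorn applications streaming data to a LabVIEW instrument that maps bandpower values onto joint angles) can be sketched in text form as follows. This is a minimal Python illustration of the general idea, not the paper's LabVIEW implementation: the port number, the comma-separated `delta,theta,alpha,beta,gamma` payload layout, and the power-to-angle ranges are all assumptions for the sake of the example, not the actual Unicorn protocol.

```python
# Hypothetical sketch: receive EEG bandpower values over UDP and map them
# linearly onto joint angles of a virtual robotic arm.
import socket

UDP_PORT = 1000  # assumed port; would be configured to match the sender


def bandpower_to_angle(power, p_min, p_max, a_min=0.0, a_max=180.0):
    """Linearly map a bandpower value onto a joint-angle range (degrees)."""
    power = min(max(power, p_min), p_max)  # clamp to the expected range
    return a_min + (power - p_min) * (a_max - a_min) / (p_max - p_min)


def parse_bandpowers(payload):
    """Parse an assumed comma-separated payload 'delta,theta,alpha,beta,gamma'."""
    delta, theta, alpha, beta, gamma = (float(v) for v in payload.split(","))
    return {"delta": delta, "theta": theta, "alpha": alpha,
            "beta": beta, "gamma": gamma}


def listen():
    """Receive UDP datagrams and drive one joint from the alpha band."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", UDP_PORT))
    while True:
        data, _ = sock.recvfrom(1024)
        bands = parse_bandpowers(data.decode())
        # e.g. alpha-band power drives the elbow joint
        elbow = bandpower_to_angle(bands["alpha"], p_min=0.0, p_max=50.0)
        print(f"elbow angle: {elbow:.1f} deg")
```

In the paper's setup, the same receive-parse-map steps would be built from LabVIEW's native UDP functions, with the resulting angles animating the shoulder, elbow, and wrist of the 3D arm.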

Keywords
Brain-Computer Interface, LabVIEW, Unicorn Headset