Convolutional Neural Network Based Electroencephalogram Controlled Robotic Arm

Citation

Lim, Zheng You and Neo, Yong Quan (2021) Convolutional Neural Network Based Electroencephalogram Controlled Robotic Arm. In: 2021 IEEE International Conference on Automatic Control & Intelligent Systems (I2CACIS), 26-26 June 2021, Shah Alam, Malaysia.


Abstract

In this paper, we present a six-degree-of-freedom (DOF) robotic arm that can be directly controlled by brainwaves, also known as electroencephalogram (EEG) signals. The EEG signals are acquired using an open-source device, the OpenBCI Ultracortex Mark IV headset. In this research, inverse kinematics is implemented to simplify control of the robotic arm into 8 end-effector commands: forward, backward, upward, downward, left, right, open and close. A deep learning method, namely a convolutional neural network (CNN) constructed in the Python programming language, is used to classify the EEG signals into the 8 mental commands. The recall rate and precision of the 8-mental-command classification using the CNN model in this research reach up to 91.9% and 92%, respectively. The average inference time of the system is 1.5 seconds. Hence, this research offers a breakthrough technology that allows disabled persons, for example paralyzed patients and upper-limb amputees, to control a robotic arm to handle their daily-life tasks.
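The paper's exact CNN architecture is not given in this record, but the pipeline it describes (an EEG window classified into one of 8 mental commands) can be illustrated with a minimal NumPy forward-pass sketch. All shapes here are assumptions, not values from the paper: 8 EEG channels, a 1-second window at 250 Hz, 16 temporal filters, and randomly initialised weights standing in for a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): 8 EEG channels, a 1-second
# window at 250 Hz, 16 temporal filters, and the 8 mental commands.
N_CHANNELS, N_SAMPLES, N_FILTERS, N_CLASSES, KERNEL = 8, 250, 16, 8, 25

# Random weights stand in for a trained model.
conv_w = rng.standard_normal((N_FILTERS, N_CHANNELS, KERNEL)) * 0.01
dense_w = rng.standard_normal((N_FILTERS, N_CLASSES)) * 0.01

def classify_window(eeg):
    """Forward pass: temporal conv -> ReLU -> global average pool -> softmax."""
    steps = eeg.shape[1] - KERNEL + 1
    feat = np.empty((N_FILTERS, steps))
    for t in range(steps):  # "valid" 1-D convolution over the time axis
        feat[:, t] = np.tensordot(conv_w, eeg[:, t:t + KERNEL],
                                  axes=([1, 2], [0, 1]))
    feat = np.maximum(feat, 0)          # ReLU
    pooled = feat.mean(axis=1)          # global average pooling over time
    logits = pooled @ dense_w
    exp = np.exp(logits - logits.max()) # numerically stable softmax
    return exp / exp.sum()

COMMANDS = ["forward", "backward", "upward", "downward",
            "left", "right", "open", "close"]

window = rng.standard_normal((N_CHANNELS, N_SAMPLES))  # synthetic EEG window
probs = classify_window(window)
command = COMMANDS[int(np.argmax(probs))]
```

In the system described, the predicted command would then be translated by inverse kinematics into an end-effector motion of the 6-DOF arm; the sketch stops at the classification step.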

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Kinematics
Subjects: Q Science > QA Mathematics > QA801-939 Analytic mechanics
Divisions: Faculty of Engineering and Technology (FET)
Depositing User: Ms Nurul Iqtiani Ahmad
Date Deposited: 30 Aug 2021 05:31
Last Modified: 30 Aug 2021 05:31
URI: http://shdl.mmu.edu.my/id/eprint/9470
