Real-Time, Multi-Command Drone Navigation Using a Consumer-Grade EEG-Based SSVEP BCI
Steady-state visual evoked potential (SSVEP) brain-computer interfaces (BCIs) provide a non-invasive method for hands-free device control, but their practical application is limited by reliance on costly laboratory-grade electroencephalography (EEG) systems. This study addresses that limitation by designing and evaluating a real-time, six-command SSVEP-BCI for drone navigation using a consumer-grade EEG headset. An adaptive processing pipeline was developed to extract spectral and spatial features, which were classified with Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN) models. Analysis of data from 30 participants showed that the RF classifier achieved the best balance of accuracy and speed among the three models, reaching a classification accuracy of 87.24% with a computational latency of 0.09 seconds and a corresponding information transfer rate (ITR) of 35.0 bits/min. In contrast, the ANN did not reach sufficient accuracy, and SVM performance was only marginal. These findings demonstrate the viability of low-cost, multi-command SSVEP-BCIs for applications in assistive technology, teleoperation, and human-computer interaction.
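For context on how the reported figures fit together, the sketch below computes the standard Wolpaw information transfer rate from the number of commands, the classification accuracy, and the time per selection. It is illustrative only and not code from the study: the per-selection window is not given in the abstract, so the roughly 3-second value used here is simply the one implied by 87.24% accuracy over six commands and the reported 35.0 bits/min.

import math

def wolpaw_itr_bits_per_min(n_commands, accuracy, selection_time_s):
    # Standard Wolpaw ITR: bits per selection scaled to bits per minute.
    p = accuracy
    bits = math.log2(n_commands)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n_commands - 1))
    return bits * (60.0 / selection_time_s)

# 6 commands, 87.24% accuracy; a ~3 s selection window (an assumption, not a
# figure stated in the abstract) reproduces roughly the reported 35.0 bits/min.
print(round(wolpaw_itr_bits_per_min(6, 0.8724, 3.0), 1))  # ~34.8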
Copyright (c) 2025 Anderias Eko Wijaya, Nurizati Nurizati, Rian Hermawan, Muhammad Agung Suhendra (Author)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0) that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).





