1. Naseer N, Hong K S. fNIRS-based brain-computer interfaces: a review. Front Hum Neurosci, 2015, 9: 172.
2. Eastmond C, Subedi A, De S, et al. Deep learning in fNIRS: a review. Neurophotonics, 2022, 9(4): 041411.
3. Phillips V Z, Canoy R J, Paik S H, et al. Functional near-infrared spectroscopy as a personalized digital healthcare tool for brain monitoring. J Clin Neurol, 2023, 19(2): 115-124.
4. Zhang Y, Wang B, Gao F. Real-time decoding for fNIRS-based brain computer interface using adaptive Gaussian mixture model classifier and Kalman estimator//2018 Asia Communications and Photonics Conference (ACP), Hangzhou: IEEE, 2018: 1-4.
5. Bai L, Zhang Y, Liu D, et al. BCI application of a high-sensitivity multi-channel fNIRS system: binary "yes/no" intention recognition. Chin J Lasers, 2022, 49(5): 0507209.
6. Hong K S, Ghafoor U, Khan M J. Brain-machine interfaces using functional near-infrared spectroscopy: a review. Artif Life Robotics, 2020, 25(2): 204-218.
7. Pan Y, Dikker S, Goldstein P, et al. Instructor-learner brain coupling discriminates between instructional approaches and predicts learning. Neuroimage, 2020, 211: 116657.
8. Kwon J, Im C H. Subject-independent functional near-infrared spectroscopy-based brain-computer interfaces based on convolutional neural networks. Front Hum Neurosci, 2021, 15: 646915.
9. Zhang Y, Liu D, Zhang P, et al. Combining robust level extraction and unsupervised adaptive classification for high-accuracy fNIRS-BCI: An evidence on single-trial differentiation between mentally arithmetic- and singing-tasks. Front Neurosci, 2022, 16: 938518.
10. Kwon O Y, Lee M H, Guan C T, et al. Subject-independent brain-computer interfaces based on deep convolutional neural networks. IEEE Trans Neural Netw Learn Syst, 2020, 31(10): 3839-3852.
11. Li J, Wang F, Huang H, et al. A novel semi-supervised meta learning method for subject-transfer brain-computer interface. Neural Netw, 2023, 163: 195-204.
12. Dolzhikova I, Abibullaev B, Sameni R, et al. Subject-independent classification of motor imagery tasks in EEG using multisubject ensemble CNN. IEEE Access, 2022, 10: 81355-81363.
13. Jin L, Kim E Y. Interpretable cross-subject EEG-based emotion recognition using channel-wise features. Sensors, 2020, 20(23): 6719.
14. Zhang Y, Liu D, Li T, et al. CGAN-rIRN: a data-augmented deep learning approach to accurate classification of mental tasks for a fNIRS-based brain-computer interface. Biomed Opt Express, 2023, 14(6): 2934-2954.
15. Raizada R D, Connolly A C. What makes different people's representations alike: neural similarity space solves the problem of across-subject fMRI decoding. J Cogn Neurosci, 2012, 24(4): 868-877.
16. Lee A, Särkkä A, Madhyastha T M, et al. Characterizing cross-subject spatial interaction patterns in functional magnetic resonance imaging studies: A two-stage point-process model. Biom J, 2017, 59(6): 1352-1381.
17. Ruan Y, Du M, Ni T. Transfer discriminative dictionary pair learning approach for across-subject EEG emotion classification. Front Psychol, 2022, 13: 899983.
18. Wang L M, Huang Y H, Chou P H, et al. Characteristics of brain connectivity during verbal fluency test: convolutional neural network for functional near-infrared spectroscopy analysis. J Biophotonics, 2022, 15(1): e202100180.
19. Zhang Y, Zhang X, Sun H, et al. Portable brain-computer interface based on novel convolutional neural network. Comput Biol Med, 2019, 107: 248-256.
20. Huang C, Xiao Y, Xu G. Predicting human intention-behavior through EEG signal analysis using multi-scale CNN. IEEE/ACM Trans Comput Biol Bioinform, 2021, 18(5): 1722-1729.
21. Wen H, Shi J, Chen W, et al. Transferring and generalizing deep-learning-based neural encoding models across subjects. Neuroimage, 2018, 176: 152-163.
22. Khalil K, Asgher U, Ayaz Y. Novel fNIRS study on homogeneous symmetric feature-based transfer learning for brain-computer interface. Sci Rep, 2022, 12(1): 3198.
23. Hiwa S, Hanawa K, Tamura R, et al. Analyzing brain functions by subject classification of functional near-infrared spectroscopy data using convolutional neural networks analysis. Comput Intell Neurosci, 2016, 2016: 1841945.
24. Yoo S H, Woo S W, Amad Z. Classification of three categories from prefrontal cortex using LSTM networks: fNIRS study//2018 18th International Conference on Control, Automation and Systems (ICCAS), South Korea: IEEE, 2018: 1141-1146.
25. Hennrich J, Herff C, Heger D, et al. Investigating deep learning for fNIRS based BCI//2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan: IEEE, 2015: 2844-2847.
26. Liu F, Chen R, Xing K, et al. Fast fault diagnosis algorithm for rolling bearings based on transfer learning and deep residual network. J Vib Shock, 2022, 41(3): 154-164.
27. Chowdhury A K, Tjondronegoro D, Chandran V, et al. Physical activity recognition using posterior-adapted class-based fusion of multiaccelerometer data. IEEE J Biomed Health Inform, 2018, 22(3): 678-685.
28. Liu Y, Liu D, Zhang Y, et al. Portable fNIRS topographic imaging system for brain-computer interface applications: fully parallel detection and preliminary paradigm experiments. Chin J Lasers, 2021, 48(11): 149-157.
29. Pan T, Wang B, Liu D, et al. A three-wavelength, 240-channel NIRS-DOT system of lock-in photon-counting mode for brain functional investigation//Conference on Optical Tomography and Spectroscopy of Tissue XIII, San Francisco: SPIE, 2019, 10874: 108741U.
30. Zhao H, Gao F, Tanikawa Y, et al. Imaging of in vitro chicken leg using time-resolved near-infrared optical tomography. Phys Med Biol, 2002, 47(11): 1979-1993.
31. Duan L, Zhao Z, Lin Y, et al. Wavelet-based method for removing global physiological noise in functional near-infrared spectroscopy. Biomed Opt Express, 2018, 9(8): 3805-3820.
32. Nguyen H D, Yoo S H, Bhutta M R, et al. Adaptive filtering of physiological noises in fNIRS data. Biomed Eng Online, 2018, 17(1): 180.
33. Ghonchi H, Fateh M, Abolghasemi V, et al. Deep recurrent-convolutional neural network for classification of simultaneous EEG-fNIRS signals. IET Signal Process, 2020, 14(3): 142-153.
34. Li Z, Jiang Y H, Duan L, et al. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation. J Neural Eng, 2017, 14(4): 046014.