[1] A. J. Scott, T. L. Webb, M. Martyn-St James, G. Rowse, and S. Weich, “Improving sleep quality leads to better mental health: A meta-analysis of randomised controlled trials,” Sleep Med. Rev., vol. 60, p. 101556, Dec. 2021, doi: 10.1016/j.smrv.2021.101556.
[2] F. Mendonça, S. S. Mostafa, F. Morgado-Dias, A. G. Ravelo-García, and T. Penzel, “A review of approaches for sleep quality analysis,” IEEE Access, vol. 7, pp. 24527-24546, 2019.
[3] J.-S. Wang, G.-R. Shih, and W.-C. Chiang, “Sleep stage classification of sleep apnea patients using decision-tree-based support vector machines based on ECG parameters,” presented at the Proceedings of 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics, IEEE, 2012, pp. 285-288.
[4] E. Rajbhandari, A. Alsadoon, P. Prasad, I. Seher, T. Q. V. Nguyen, and D. T. H. Pham, “A novel solution of enhanced loss function using deep learning in sleep stage classification: predict and diagnose patients with sleep disorders,” Multimed. Tools Appl., vol. 80, pp. 11607-11630, 2021.
[5] A. Bandyopadhyay and C. Goldstein, “Clinical applications of artificial intelligence in sleep medicine: a sleep clinician’s perspective,” Sleep Breath., vol. 27, no. 1, pp. 39-55, 2023.
[6] A. J. Boe et al., “Automating sleep stage classification using wireless, wearable sensors,” NPJ Digit. Med., vol. 2, no. 1, p. 131, 2019.
[7] M. J. Sateia, “International classification of sleep disorders,” Chest, vol. 146, no. 5, pp. 1387-1394, 2014.
[8] D. Moser et al., “Sleep classification according to AASM and Rechtschaffen & Kales: effects on sleep scoring parameters,” Sleep, vol. 32, no. 2, pp. 139-149, Feb. 2009, doi: 10.1093/sleep/32.2.139.
[9] Y.-L. Hsu, Y.-T. Yang, J.-S. Wang, and C.-Y. Hsu, “Automatic sleep stage recurrent neural classifier using energy features of EEG signals,” Neurocomputing, vol. 104, pp. 105-114, 2013.
[10] R. B. Berry et al., “AASM scoring manual updates for 2017 (version 2.4),” J. Clin. Sleep Med., vol. 13, no. 5, pp. 665-666, 2017.
[11] A. Supratak, H. Dong, C. Wu, and Y. Guo, “DeepSleepNet: A model for automatic sleep stage scoring based on raw single-channel EEG,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 25, no. 11, pp. 1998-2008, 2017.
[12] A. Supratak and Y. Guo, “TinySleepNet: An efficient deep learning model for sleep stage scoring based on raw single-channel EEG,” presented at the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), IEEE, 2020, pp. 641-644.
[13] E. Eldele et al., “An Attention-Based Deep Learning Approach for Sleep Stage Classification With Single-Channel EEG,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 29, pp. 809-818, 2021, doi: 10.1109/TNSRE.2021.3076234.
[14] E. Eldele et al., “Time-Series Representation Learning via Temporal and Contextual Contrasting,” Aug. 2021, pp. 2352-2359, doi: 10.24963/ijcai.2021/324.
[15] E. Eldele et al., Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification. 2022. doi: 10.48550/arXiv.2208.06616.
[16] P. A. Abhang, B. W. Gawali, and S. C. Mehrotra, “Chapter 2 - Technological Basics of EEG Recording and Operation of Apparatus,” in Introduction to EEG- and Speech-Based Emotion Recognition, 2016.
[17] G. M. Rojas, C. Alvarez, C. E. Montoya, M. De la Iglesia-Vaya, J. E. Cisternas, and M. Galvez, “Study of resting-state functional connectivity networks using EEG electrodes position as seed,” Front. Neurosci., vol. 12, p. 235, 2018.
[18] A. K. Patel, V. Reddy, and J. F. Araujo, “Physiology, sleep stages,” in StatPearls [Internet], StatPearls Publishing, 2022.
[19] I. Feinberg and T. Floyd, “Systematic trends across the night in human sleep cycles,” Psychophysiology, vol. 16, no. 3, pp. 283-291, 1979.
[20] L. Fraiwan, K. Lweesy, N. Khasawneh, H. Wenz, and H. Dickhaus, “Automated sleep stage identification system based on time-frequency analysis of a single EEG channel and random forest classifier,” Comput. Methods Programs Biomed., vol. 108, no. 1, pp. 10-19, 2012.
[21] B. Koley and D. Dey, “An ensemble system for automatic sleep stage classification using single channel EEG signal,” Comput. Biol. Med., vol. 42, no. 12, pp. 1186-1195, 2012.
[22] A. Smith, H. Anand, S. Milosavljevic, K. M. Rentschler, A. Pocivavsek, and H. Valafar, “Application of Machine Learning to sleep stage classification,” presented at the 2021 International Conference on Computational Science and Computational Intelligence (CSCI), IEEE, 2021, pp. 349-354.
[23] J. Johnson, “What’s a Deep Neural Network? Deep Nets Explained,” BMC Blogs. https://www.bmc.com/blogs/deep-neural-network/ (accessed Jun. 13, 2023).
[24] F. Ebrahimi, M. Mikaeili, E. Estrada, and H. Nazeran, “Automatic sleep stage classification based on EEG signals by using neural networks and wavelet packet coefficients,” presented at the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, IEEE, 2008, pp. 1151-1154.
[25] H. Phan, F. Andreotti, N. Cooray, O. Y. Chén, and M. De Vos, “Joint classification and prediction CNN framework for automatic sleep stage classification,” IEEE Trans. Biomed. Eng., vol. 66, no. 5, pp. 1285-1296, 2018.
[26] N. Michielli, U. R. Acharya, and F. Molinari, “Cascaded LSTM recurrent neural network for automated sleep stage classification using single-channel EEG signals,” Comput. Biol. Med., vol. 106, pp. 71-81, 2019.
[27] C. Li, Y. Qi, X. Ding, J. Zhao, T. Sang, and M. Lee, “A deep learning method approach for sleep stage classification with EEG spectrogram,” Int. J. Environ. Res. Public. Health, vol. 19, no. 10, p. 6322, 2022.
[28] “Mixed Neural Network Approach for Temporal Sleep Stage Classification.” https://ieeexplore.ieee.org/abstract/document/7995122/ (accessed Jun. 06, 2023).
[29] C. O’Reilly, N. Gosselin, J. Carrier, and T. Nielsen, “Montreal Archive of Sleep Studies: an open-access resource for instrument benchmarking and exploratory research,” J. Sleep Res., vol. 23, no. 6, pp. 628-635, 2014.
[30] B. Kemp, A. Zwinderman, B. Tuk, H. Kamphuisen, and J. Oberyé, “Sleep-EDF Database Expanded,” Physionet Org, 2018.
[31] K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” ArXiv Prepr. ArXiv14091556, 2014.
[32] L. Taylor and G. Nitschke, “Improving deep learning with generic data augmentation,” presented at the 2018 IEEE Symposium Series on Computational Intelligence (SSCI), IEEE, 2018, pp. 1542-1547.
[33] M. A. Gordon and K. Duh, “Explaining sequence-level knowledge distillation as data-augmentation for neural machine translation,” ArXiv Prepr. ArXiv191203334, 2019.
[34] D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” ArXiv Prepr. ArXiv14126980, 2014.
[35] R. Mueller, “Sleep Data - National Sleep Research Resource - NSRR.” https://sleepdata.org/ (accessed Jun. 23, 2023).
[36] J. Y. Cheng, H. Goh, K. Dogrusoz, O. Tuzel, and E. Azemi, “Subject-aware contrastive learning for biosignals,” ArXiv Prepr. ArXiv200704871, 2020.
[37] D. Anguita, A. Ghio, L. Oneto, X. Parra, and J. L. Reyes-Ortiz, “A public domain dataset for human activity recognition using smartphones,” presented at the ESANN, 2013, p. 3.
[38] R. G. Andrzejak, K. Lehnertz, F. Mormann, C. Rieke, P. David, and C. E. Elger, “Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: Dependence on recording region and brain state,” Phys. Rev. E, vol. 64, no. 6, p. 061907, 2001.
[39] C. Lessmeier, J. K. Kimotho, D. Zimmer, and W. Sextro, “Condition monitoring of bearing damage in electromechanical drive systems by using motor current signals of electric motors: A benchmark data set for data-driven classification,” presented at the PHM Society European Conference, 2016.
[40] D. Moser et al., “Sleep classification according to AASM and Rechtschaffen & Kales: effects on sleep scoring parameters,” Sleep, vol. 32, no. 2, pp. 139-149, 2009.
[41] O. Tsinalis, P. M. Matthews, and Y. Guo, “Automatic sleep stage scoring using time-frequency analysis and stacked sparse autoencoders,” Ann. Biomed. Eng., vol. 44, pp. 1587-1597, 2016.
[42] A. Schlemmer, U. Parlitz, S. Luther, N. Wessel, and T. Penzel, “Changes of sleep-stage transitions due to ageing and sleep disorder,” Philos. Trans. R. Soc. Math. Phys. Eng. Sci., vol. 373, no. 2034, p. 20140093, 2015.
[43] M. Hirshkowitz et al., “National Sleep Foundation’s updated sleep duration recommendations,” Sleep Health, vol. 1, no. 4, pp. 233-243, 2015.
[44] J. Akosa, “Predictive accuracy: A misleading performance measure for highly imbalanced data,” presented at the Proceedings of the SAS Global Forum, 2017, pp. 1+.
[45] Y. Tang, Y.-Q. Zhang, N. V. Chawla, and S. Krasser, “SVMs modeling for highly imbalanced classification,” IEEE Trans. Syst. Man Cybern. Part B Cybern., vol. 39, no. 1, pp. 281-288, 2008.
[46] J. Malik, Y.-L. Lo, and H. Wu, “Sleep-wake classification via quantifying heart rate variability by convolutional neural network,” Physiol. Meas., vol. 39, no. 8, p. 085004, 2018.
[47] A. Nguyen, K. Pham, D. Ngo, T. Ngo, and L. Pham, “An analysis of state-of-the-art activation functions for supervised deep neural network,” presented at the 2021 International Conference on System Science and Engineering (ICSSE), IEEE, 2021, pp. 215-220.
[48] D. Hendrycks and K. Gimpel, “Gaussian error linear units (GELUs),” ArXiv Prepr. ArXiv160608415, 2016.
[49] M. Lee, “GELU Activation Function in Deep Learning: A Comprehensive Mathematical Analysis and Performance,” ArXiv Prepr. ArXiv230512073, 2023.
[50] M. Ureña-Pliego, R. Martinez-Marin, B. González-Rodrigo, and M. Marchamalo-Sacristan, “Automatic Building Height Estimation: Machine Learning Models for Urban Image Analysis,” Appl. Sci., vol. 13, no. 8, p. 5037, 2023.
[51] “Welcome to Flask — Flask Documentation (1.1.x).” https://flask.palletsprojects.com/en/1.1.x/ (accessed Jun. 15, 2023).
[52] “Chart.js | Chart.js.” https://www.chartjs.org/docs/latest/ (accessed Jun. 26, 2023).
[53] “Files :: Anaconda.org.” https://anaconda.org/conda-forge/conda/files?version=4.5.11 (accessed Jun. 26, 2023).
[54] “Papers with Code - Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.” https://paperswithcode.com/paper/delving-deep-into-rectifiers-surpassing-human (accessed Jul. 08, 2023).
[55] J. Xu, Z. Li, B. Du, M. Zhang, and J. Liu, “Reluplex made more practical: Leaky ReLU,” presented at the 2020 IEEE Symposium on Computers and Communications (ISCC), IEEE, 2020, pp. 1-7.
APPENDIX
Experiments with activation functions
During model development and training, the choice of activation function plays an important role in improving the model's performance and accuracy. The team therefore conducted a series of experiments to investigate and compare the effectiveness of different activation functions.
Aiming for an improvement, the team experimented with three alternative activation functions: GELU [49], PReLU [54], and LeakyReLU [55]. To evaluate the effectiveness of each activation function, the team used numerical results and charts to compare accuracy and the other metrics during training and testing. The results of these experiments, together with the comparison and evaluation of the activation functions, are presented below.
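To illustrate how the activation functions were varied, the sketch below shows one way to make the non-linearity of a convolutional feature-extraction block configurable. It is a minimal PyTorch example under assumed layer sizes; the conv_block helper and its parameters are hypothetical and are not the actual TinySleepNet, TS-TCC, or CA-TCC code.

# Minimal sketch, assuming a PyTorch setup; layer sizes and the conv_block
# helper are illustrative, not the exact TinySleepNet / TS-TCC / CA-TCC code.
import torch
import torch.nn as nn

ACTIVATIONS = {
    "ReLU": nn.ReLU,
    "PReLU": nn.PReLU,           # learnable negative slope [54]
    "LeakyReLU": nn.LeakyReLU,   # small fixed negative slope [55]
    "GELU": nn.GELU,             # Gaussian Error Linear Unit [48], [49]
}

def conv_block(in_ch, out_ch, activation="ReLU"):
    """One EEG feature-extraction block with a configurable activation."""
    return nn.Sequential(
        nn.Conv1d(in_ch, out_ch, kernel_size=8, stride=1, padding=4, bias=False),
        nn.BatchNorm1d(out_ch),
        ACTIVATIONS[activation](),
        nn.MaxPool1d(kernel_size=2),
    )

if __name__ == "__main__":
    x = torch.randn(4, 1, 3000)   # 4 epochs of 30 s single-channel EEG at 100 Hz
    for name in ACTIVATIONS:
        block = conv_block(1, 64, activation=name)
        print(name, tuple(block(x).shape))   # same shape, different non-linearity

Because only the activation is swapped while the rest of the architecture and training procedure stay identical, the differences reported below can be attributed to the non-linearity alone.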
Table PL.1: Experimental results of the activation-function variants

Accuracy        ReLU     PReLU    LeakyReLU   GELU
TinySleepNet    81.88%   81.18%   80.33%      82.04%
TS-TCC          70.26%   70.75%   68.48%      71.22%
CA-TCC          68.66%   65.12%   70.78%      71.58%

WAvg. F1        ReLU     PReLU    LeakyReLU   GELU
TinySleepNet    81.97%   80.89%   80.68%      81.81%
TS-TCC          68.01%   69.51%   68.94%      71.06%
CA-TCC          69.35%   63.91%   69.92%      70.88%

WAvg. Gm        ReLU     PReLU    LeakyReLU   GELU
TinySleepNet    87.56%   86.86%   87.01%      87.33%
TS-TCC          75.84%   76.74%   78.73%      79.56%
CA-TCC          78.25%   72.83%   77.34%      79.11%
Figure PL.1: Comparison of the Accuracy of the methods when using ReLU, GELU, PReLU, and LeakyReLU (bar chart; values as in Table PL.1).
Figure PL.2: Comparison of the W.Avg F1 of the methods when using ReLU, GELU, PReLU, and LeakyReLU (bar chart; values as in Table PL.1).
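The three metrics reported in Table PL.1 and visualised in Figures PL.1 and PL.2 can be reproduced from the per-epoch predictions. The sketch below is a minimal, assumed scikit-learn implementation: accuracy and weighted-average F1 come directly from sklearn.metrics, while the weighted G-mean is computed here per class as the square root of sensitivity times specificity and averaged with class support as weights, which is an assumption about how "WAvg. Gm" was obtained; the label arrays are dummy data.

# Minimal sketch, assuming scikit-learn; y_true / y_pred are dummy arrays and
# the weighted G-mean definition below is an assumption about "WAvg. Gm".
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, f1_score

def weighted_gmean(y_true, y_pred):
    """Support-weighted average of the per-class geometric mean."""
    cm = confusion_matrix(y_true, y_pred)
    support = cm.sum(axis=1)          # number of true samples per class
    total = cm.sum()
    gms = []
    for k in range(cm.shape[0]):
        tp = cm[k, k]
        fn = support[k] - tp
        fp = cm[:, k].sum() - tp
        tn = total - tp - fn - fp
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        gms.append(np.sqrt(sens * spec))
    return float(np.average(gms, weights=support))

# Per-epoch sleep-stage labels, e.g. 0=W, 1=N1, 2=N2, 3=N3, 4=REM (dummy data).
y_true = np.array([0, 1, 2, 2, 3, 4, 2, 1, 0, 2])
y_pred = np.array([0, 2, 2, 2, 3, 4, 1, 1, 0, 2])

print("Accuracy :", accuracy_score(y_true, y_pred))
print("WAvg. F1 :", f1_score(y_true, y_pred, average="weighted"))
print("WAvg. Gm :", weighted_gmean(y_true, y_pred))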