Title page for etd-0704120-110227
Title
Human Gesture Recognition using Convolutional Neural Network
Department
Year, semester
Language
Degree
Number of pages
66
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2020-07-21
Date of Submission
2020-08-04
Keywords
thermal image, deep learning, infrared, convolutional neural network, falling down
Statistics
The thesis/dissertation has been browsed 5736 times and has been downloaded 0 times.
Abstract
With modern advances in medical care, the proportion of elderly people keeps rising. Their adult children are often busy with work and cannot stay with them during the day, so accidents such as falls can occur while the elderly are home alone or living by themselves. A fall may cause serious injury or even death, so detecting an elderly person's fall and discovering it in time helps reduce the risk that the accident goes unnoticed and medical treatment is delayed. In recent years, deep learning has performed very well in image recognition, and convolutional neural networks (CNNs) in particular have greatly increased the applicability and popularity of deep learning. This thesis uses a convolutional neural network to recognize and classify the state of the human body on and beside a bed. Unlike sensors that must be in contact with the subject, such as wearable sensors and pressure mattresses, this work collects thermal images with an infrared thermal imager, which is not affected by lighting and can still be used in an environment without any light; moreover, the subject cannot forget to carry a device. Four states of the human body on and beside the bed are classified, with the thermal images used as input to train the CNN. A total of 170 recording sets containing 37,648 thermal images were collected, and falls while getting in and out of bed were simulated with a variety of different movements, in order to enrich the training data and cover as many of the falls that occur in daily life as possible, thereby increasing the classification accuracy. The final validation accuracy is 95.9%.
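The record describes the approach only at a high level. As a rough illustration, the following is a minimal PyTorch sketch of a small CNN of the kind outlined in Chapter 3 of the table of contents below (convolutional, pooling, fully connected, and loss layers) applied to single-channel thermal frames. The 80x60 input resolution (the FLIR Lepton's native format), the layer sizes, and the four class labels are assumptions made for illustration only, not the thesis's actual architecture or data.

# Minimal sketch (not the thesis's actual network): a small CNN that classifies
# single-channel thermal frames into four bed-related states.
import torch
import torch.nn as nn

NUM_CLASSES = 4  # assumed labels, e.g. in bed, sitting on the edge, out of bed, fallen

class ThermalPostureCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolutional layer
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling layer
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 15 * 20, 64),                 # fully connected layer (60x80 input -> 15x20 feature map)
            nn.ReLU(),
            nn.Linear(64, num_classes),                  # logits for the four states
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = ThermalPostureCNN()
    frames = torch.rand(8, 1, 60, 80)   # batch of 8 fake 80x60 thermal frames
    logits = model(frames)
    loss = nn.CrossEntropyLoss()(logits, torch.randint(0, NUM_CLASSES, (8,)))  # loss layer
    print(logits.shape, loss.item())

In the thesis itself, the network is trained on the 37,648 collected thermal images and reaches a validation accuracy of 95.9%; the sketch above only shows the overall shape of such a classifier.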
Table of Contents
Thesis Approval Form i
Thesis Public Release Authorization ii
Acknowledgements iii
Abstract (Chinese) iv
Abstract v
Table of Contents vi
List of Figures viii
List of Tables x
Chapter 1 Introduction 1
1.1 Research Motivation 1
1.2 Sleep Motion Detection Methods 1
1.3 Introduction to Thermal Imaging 5
1.3.1 Principles of Infrared Radiation 5
1.3.2 Types of Sensors 7
1.3.3 Applications of Thermal Imaging 8
1.4 Research Objectives and Thesis Organization 10
1.4.1 Research Objectives 10
1.4.2 Thesis Organization 11
Chapter 2 Experimental Setup 12
2.1 Hardware 12
2.1.1 Heat Finder 13
2.1.2 Protective Layer 14
2.1.3 Lepton System Architecture 15
2.1.4 Lepton Data Transmission 17
2.2 Software 21
2.3 Thermal Imaging Experiments 24
2.3.1 Blanket Experiment 24
2.3.2 Mattress Residual Heat Experiment 25
Chapter 3 Convolutional Neural Network Architecture 28
3.1 Introduction to Convolutional Neural Networks 28
3.1.1 Convolutional Layer 29
3.1.2 Pooling Layer 30
3.1.3 Fully Connected Layer 31
3.1.4 Loss Layer 33
3.2 Network Architecture 35
3.2.1 Image Preprocessing 35
3.2.2 CNN Architecture 36
Chapter 4 Experimental Design and Results 39
4.1 Experimental Procedure 39
4.1.1 Data Collection 39
4.1.2 Data Processing 41
4.2 Experimental Results 43
4.2.1 Training and Testing 43
4.2.2 Validation 46
4.3 Comparison with Related Work 46
Chapter 5 Conclusions and Future Work 49
5.1 Conclusions and Future Work 49
References 51
Fulltext
This electronic full text is licensed only for personal, non-profit academic research: retrieval, reading, and printing. Please comply with the relevant provisions of the Copyright Act of the Republic of China and do not reproduce, distribute, adapt, repost, or broadcast it without authorization.
Thesis access permission: user-defined release date
Available:
Campus: available for download from 2025-08-04
Off-campus: available for download from 2025-08-04


Printed copies
Public access information for printed theses is relatively complete from academic year 102 (2013/14) onward. To look up public access information for printed theses from academic year 101 or earlier, please contact the printed thesis service desk of the Office of Library and Information Services. We apologize for any inconvenience.
Available: 2025-08-04
