DSpace Repository

Human Computer Interaction System for Impaired People by using Kinect Motion Sensor: Voice and Gesture Integrated Smart Home

dc.contributor.author Rathnayake, K.A.S.V.
dc.contributor.author Wanniarachchi, W.K.I.L.
dc.contributor.author Nanayakkara, W. H. K. P.
dc.date.accessioned 2020-01-22T06:10:08Z
dc.date.available 2020-01-22T06:10:08Z
dc.date.issued 2018
dc.identifier.citation Rathnayake, K.A.S.V., et al. (2018). "Human Computer Interaction System for Impaired People by using Kinect Motion Sensor: Voice and Gesture Integrated Smart Home", 2nd International Conference on Inventive Communication and Computational Technologies (ICICCT 2018) en_US
dc.identifier.uri http://dr.lib.sjp.ac.lk/handle/123456789/8844
dc.description.abstract Gesture- and speech-based human-computer interaction is one of the most natural and convenient ways of communication, helping to minimize the gap between human and machine. In this paper, we propose a single HCI system which can be utilized by people with physical challenges as well as by people with speaking and hearing disabilities. Voice and gesture commands acquired by a Kinect motion sensor are used to control home appliances via a Wi-Fi enabled wireless network hub. Although a variety of HCI systems based on different technologies exist today, their cost and accuracy remain debatable. Experiments carried out during evaluation reveal that our system provides more than 80% recognition accuracy in both gesture mode and voice mode. Moreover, it works well in uncontrolled environments. en_US
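dc.description.note As a minimal illustration of the appliance-control path described in the abstract (recognized voice/gesture command forwarded to a Wi-Fi enabled hub), the Python sketch below posts a command to a hypothetical HTTP endpoint on a NodeMCU-style hub. The paper itself does not specify the protocol; the host address, endpoint path, and JSON payload here are illustrative assumptions only.

    # Minimal sketch (hypothetical): forwarding a recognized command from the
    # HCI application to a Wi-Fi appliance hub. The hub URL and payload format
    # are assumptions, not the protocol used in the paper.
    import json
    import urllib.request

    HUB_URL = "http://192.168.1.50/appliance"   # assumed NodeMCU hub address

    def send_command(appliance: str, action: str) -> bool:
        """Post a recognized voice/gesture command to the hub and report success."""
        payload = json.dumps({"appliance": appliance, "action": action}).encode()
        request = urllib.request.Request(
            HUB_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(request, timeout=5) as response:
                return response.status == 200
        except OSError:
            return False

    if __name__ == "__main__":
        # e.g. a "lights on" gesture recognized by the Kinect pipeline
        print(send_command("living_room_light", "on"))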
dc.language.iso en en_US
dc.subject Microsoft Kinect V2; HCI; Gesture recognition; Voice recognition; Smart home; NodeMCU en_US
dc.title Human Computer Interaction System for Impaired People by using Kinect Motion Sensor: Voice and Gesture Integrated Smart Home en_US
dc.type Article en_US

