Estimating which Object Type a Sensor Node is Attached to in Ubiquitous Sensor Environment
Author(s): Takuya Maekawa (NTT Communication Science Laboratories, Japan), Yutaka Yanagisawa (NTT Communication Science Laboratories, Japan), and Takeshi Okadome (NTT Communication Science Laboratories, Japan)
Copyright: 2012
Pages: 14
Source title: Breakthroughs in Software Science and Computational Intelligence
Source Author(s)/Editor(s): Yingxu Wang (University of Calgary, Canada)
DOI: 10.4018/978-1-4666-0264-9.ch023
Abstract
By simply attaching sensor nodes to physical objects, with no prior information about those objects, the method proposed in this paper infers the type of indoor object each node is attached to and the state the object is in. Assuming that each object has its own states, with transitions represented by a state transition diagram, we prepare state transition diagrams for such indoor objects as doors, drawers, chairs, and lockers. The method then selects, from the prepared diagrams, the state transition diagram that best matches sensor data collected from people's daily activities over a certain period. A two-week experiment shows that the method infers the objects to which sensor nodes are attached with high accuracy. The method allows us to build ubiquitous sensor environments by simply attaching sensor nodes to the physical objects around us.
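The matching idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the event labels, diagrams, and scoring function below are simplified assumptions. Each object type gets a hand-made state transition diagram (a map from a state and an observed event to the next state), and the object whose diagram explains the largest fraction of an observed event sequence is chosen.

```python
# Minimal sketch (assumption, not the authors' method) of inferring which
# object type a sensor node is attached to by matching an observed event
# sequence against prepared state transition diagrams.

# Each diagram maps (state, event) -> next state. The event labels here
# are simplified stand-ins for features a real system would derive from
# the node's sensor readings.
DIAGRAMS = {
    "door":   {("closed", "swing"): "open",  ("open", "swing"): "closed"},
    "drawer": {("closed", "slide"): "open",  ("open", "slide"): "closed"},
    "chair":  {("idle", "tilt"): "occupied", ("occupied", "tilt"): "idle"},
}

def match_score(diagram, events, start_state):
    """Fraction of observed events explained by the diagram's transitions."""
    state, explained = start_state, 0
    for event in events:
        next_state = diagram.get((state, event))
        if next_state is not None:
            state, explained = next_state, explained + 1
    return explained / len(events) if events else 0.0

def infer_object(events, start_states):
    """Return the object type whose diagram best explains the event stream."""
    return max(DIAGRAMS,
               key=lambda obj: match_score(DIAGRAMS[obj], events, start_states[obj]))

start_states = {"door": "closed", "drawer": "closed", "chair": "idle"}
print(infer_object(["slide", "slide", "slide", "slide"], start_states))  # drawer
```

A longer observation period, as in the paper's two-week experiment, gives each candidate diagram more events to explain, which is what separates objects whose diagrams overlap on short sequences.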