Egocentric Landmark-Based Indoor Guidance System for the Visually Impaired
Abstract
In this paper, we introduce an egocentric landmark-based guidance system that enables visually impaired users to interact with indoor environments. The user, wearing Google Glass, captures the surroundings within his or her field of view. Using this information, we provide the user with an accurate landmark-based description of the environment, including the relative distance and orientation to each landmark. To achieve this functionality, we developed a near-real-time, accurate, vision-based localization algorithm. Because the users are visually impaired, our algorithm accounts for images captured with Google Glass that exhibit severe blur, motion blur, low illumination intensity, and crowd obstruction. We tested the algorithm's performance in a 12,000 ft² open indoor environment. With pristine (mint) query images, our algorithm achieves a mean location accuracy within 5 ft., a mean orientation accuracy of less than 2 degrees, and reliability above 88%. After applying deformation effects to the query images, such as blur, motion blur, and illumination changes, we observe that reliability remains above 75%.
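The abstract describes reporting each landmark's distance and orientation relative to the user's egocentric frame. As a minimal sketch of that geometric step (the function name, coordinate convention, and units are our own assumptions, not the paper's implementation), the relative pose can be derived from the localized user position and heading:

```python
import math

def landmark_relative_pose(user_x, user_y, user_heading_deg, lm_x, lm_y):
    """Return (distance_ft, bearing_deg) of a landmark relative to the user.

    Hypothetical helper: assumes planar coordinates in feet, heading measured
    clockwise from the +y axis. bearing_deg is relative to the user's facing
    direction: 0 means straight ahead, positive means to the user's right.
    """
    dx, dy = lm_x - user_x, lm_y - user_y
    distance = math.hypot(dx, dy)
    # Absolute bearing of the landmark in the world frame (0 = +y,
    # increasing clockwise), then expressed relative to the user's heading
    # and wrapped into (-180, 180].
    absolute = math.degrees(math.atan2(dx, dy))
    relative = (absolute - user_heading_deg + 180.0) % 360.0 - 180.0
    return distance, relative
```

For example, a user at the origin facing "north" (heading 0°) with a landmark at (10, 10) would be told the landmark is about 14 ft away, 45° to the right; this is the kind of verbal cue the guidance system could generate from its localization output.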