International Conference on Advancements in Computing & Management (ICACM-2019)
Touch Tap See: A Navigation Assistant for the Visually Impaired person
Deepti Patole (a), Sunayana Jadhav (a), Khyatee Thakkar (b), Saket Gupta (b), Shardul Aeer (b)
(a) Assistant Professor, K.J. Somaiya College of Engineering, Mumbai-400077, India
(b) Student, K.J. Somaiya College of Engineering, Mumbai-400077, India
ARTICLE INFO

Article history:
Received 03 March 19
Received in revised form 07 March 19
Accepted 06 April 19

Keywords: Navigation assistant, indoor navigation, visually impaired, Firebase, mobile application

ABSTRACT

With the exponential advancements in technology each year, a plethora of applications has been developed, ranging from the most basic to the most advanced use cases. Advancements in the smartphone industry have proven to be a major factor in the rapid development of mobile applications. Although technology has been evolving, we are yet to see a breakthrough in the medical field that helps visually impaired people. A visually impaired person misses many opportunities because of a lack of confidence or dependence on others. The proposed work aims at reducing this dependence and, at the same time, at improving the confidence of visually impaired persons by enabling them to use their other sensory organs to understand their surroundings. The proposed application, "Navigation Assistant for the Visually Impaired Person - Touch Tap See", aims at providing real-time assistance to a visually impaired person at workplaces or educational institutes that are generous enough to offer them opportunities alongside other people.

© [Link] by SSRN. All rights reserved.

Peer review under responsibility of International Conference on Advancements in Computing & Management.
1. Introduction
Technology evolves rapidly; new innovations are produced every day. These technologies play a vital role in our lives, helping reshape our world and build a better future. As of 2018, 1.3 billion people in the world live with some form of visual impairment. These visually impaired people encounter various problems when it comes to navigation because of their dependence on other people for guidance. Functioning in new environments poses a great challenge for them, as they are unable to navigate those environments independently. There have been significant attempts to address this problem, but no concrete results have been achieved.
The proposed navigation assistant, "Touch Tap See", guides a visually impaired person through the surroundings so that the person using the application can reach his or her destination independently and confidently. For this purpose, the system is designed as an Android application that takes inputs from Bluetooth beacons placed at appropriate locations to determine the current location of the user. Navigation is based on a destination that the user provides to the application through a voice command. The application makes blended use of sensors commonly present in any smartphone, such as Bluetooth transceivers and GPS sensors, to figure out the most suitable path from the current location to the desired destination. Accordingly, voice instructions are given to the user: as the user walks along the path suggested by the application, it assists with detailed voice suggestions regarding the path to be taken. This is done through communication between the Bluetooth beacons and the mobile device. The application is also supported by another electronic unit, which provides information to the end user in the form of vibrations. This unit works in a standalone fashion to give the user confidence and to avoid dependence on a single source of assistance.
2. Related Work
Several innovations have been proposed to help the visually impaired. In (Villanueva, Joselin, and René Farcy, 2012), Dr. Stephen Hicks, from the University of Oxford, has created a pair of glasses that can help visually impaired people better understand their surroundings, an initiative towards making them independent. Most visually impaired people can differentiate between bright and dark light. These glasses increase the brightness in an
April 13-14, 2019 | Jagannath University, Jaipur, India Page 521
This preprint research paper has not been peer reviewed. Electronic copy available at: [Link]
attempt to let the wearer sense the presence of an obstacle in front of them. As suggested on the website (Website: [Link], accessed on 1st April 2019), the technology consists of a chunky ring that helps a visually impaired person read a normal book: the ring transcribes 12-point letters and reads the words out loud, and it also informs the user if his finger diverges from a line. This is currently at the prototype stage. In (Hu JY, Yan L, Chen YD, et al., 2017) and (Daniele Croce, 2014), the ARIANNA app (pAth Recognition for Indoor Assisted NavigatioN with Augmented perception) solves the problem of GPS being unreliable indoors. It uses colored tape to mark the routes inside the house; the users just point their smartphone cameras, and once the camera detects a colored tape, the app notifies the user of the direction he has to go in. In (Yuanqing Zheng, 2014), Travi-Navi is a vision-guided navigation system that enables a user to easily deploy indoor navigation services. In (Zuwei Yin, 2017), peer-to-peer navigation is used to help the user navigate to a particular location. In (Jiang Dong, 2015), iMoon creates a navigation mesh generated from 3-D models obtained from 2-D photos taken by people. As suggested in (Dragan Ahmetovic, 2017), navigation assistance to the visually impaired can be effectively provided by Bluetooth-based systems. In summary, the reviewed papers establish that computational geometry, artificial intelligence, and ultrasound techniques have been incorporated, but the user-oriented approach has been minimal and external obstacles have been addressed instead.
3. Proposed System
In this paper, Touch-Tap-See is proposed: a system that provides an enhanced sense of the navigation environment and works with the help of Bluetooth beacons (Dragan Ahmetovic, 2017) for indoor navigation. It also uses a Firebase database for location sharing. Whenever the user enters a building equipped with Bluetooth beacons and opens the application, markers for nearby locations are displayed on the map and announced to the user through voice-over. Upon selection of a location, the path to it is displayed on the screen and corresponding voice instructions are given to the user. We propose the use of ultrasonic sensors (Daniele Croce, 2014) for obstacle detection, which run on a separate unit and provide a better understanding of the environment.
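As a minimal sketch of how the current location might be inferred from the beacons, one simple approach is to treat the beacon with the strongest received signal strength (RSSI) as the user's nearest waypoint. The paper does not specify the positioning method, so the function and beacon names below are illustrative assumptions.

```python
# Illustrative sketch: pick the nearest beacon by strongest RSSI.
# RSSI is reported in dBm; a higher (less negative) value generally
# means the beacon is closer to the phone.

def nearest_beacon(scans):
    """scans: {beacon_id: rssi_dbm}. Returns the id of the strongest
    beacon, or None when no beacons are in range."""
    if not scans:
        return None
    return max(scans, key=scans.get)
```

For example, a scan of `{"lobby": -82, "lab": -60, "office": -75}` would place the user near the hypothetical "lab" beacon.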
The system is divided into two subsystems:
The Mobile Application –Touch Tap See
A standalone obstacle avoidance Module
The obstacle avoidance module, proposed as a standalone wristband, generates alerts in the form of vibrations produced by output actuators. The purpose of this module is to avoid complete dependency on the mobile application and the Bluetooth beacons installed in the building. The mobile application is responsible for providing voice assistance to the user regarding indoor navigation suggestions.
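A minimal sketch of the wristband's alert logic might map the ultrasonic distance reading to a vibration intensity. The distance thresholds and helper names below are illustrative assumptions; the paper does not specify them.

```python
# Illustrative sketch: convert an ultrasonic echo time to a distance,
# then map that distance to a vibration intensity for the actuator.

def echo_to_distance_cm(echo_time_s: float) -> float:
    """Round-trip echo time -> distance in cm (speed of sound ~343 m/s = 34300 cm/s)."""
    return (echo_time_s * 34300) / 2

def vibration_level(distance_cm: float) -> int:
    """Return a vibration intensity: 0 = off (path clear) up to 3 = strongest alert."""
    if distance_cm < 50:    # obstacle very close
        return 3
    if distance_cm < 100:   # obstacle approaching
        return 2
    if distance_cm < 200:   # obstacle detected at a distance
        return 1
    return 0                # path clear
```

Graded intensity lets the wearer sense not only that an obstacle exists but roughly how close it is, without any audio channel.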
Fig. 1. Application block diagram    Fig. 2. Standalone obstacle avoidance Module
4. Algorithm Used
The main objective of the algorithm is to generate waypoints (S. Willis, S. Helal, 2005) and calculate the shortest possible path between the source and the destination. This algorithm is used as it provides accurate results in a short span of time.
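The paper does not name the shortest-path algorithm used; one common choice for this kind of waypoint graph is Dijkstra's algorithm with walking distances as edge weights. The sketch below is written under that assumption, with illustrative node names.

```python
# Illustrative sketch: Dijkstra's shortest path over a graph of beacon waypoints.
import heapq

def shortest_path(graph, source, dest):
    """graph: {node: [(neighbour, distance_m), ...]}.
    Returns (total_distance, path) or (inf, []) when dest is unreachable."""
    dist = {source: 0.0}
    prev = {}
    visited = set()
    pq = [(0.0, source)]
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == dest:
            break
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    if dest not in dist:
        return float("inf"), []
    path, node = [dest], dest
    while node != source:
        node = prev[node]
        path.append(node)
    return dist[dest], path[::-1]
```

For a hypothetical floor graph such as Entrance -> Lobby -> {Lab, Office}, the function returns both the total walking distance and the ordered list of waypoints to announce.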
Fig.3. Compass rose
1. The direction is calculated in ENU coordinates, i.e. bearings: in simple words, North is 0 degrees and East is 90 degrees.
2. The heading is calculated in the same format.
3. We find the difference and use it to decide among left, right, back, and straight.
4. We have created a simple algorithm wherein we first add different legs and then find the nearest available leg.
5. To reach the nearest leg, the approximate distance from the current leg to the nearest leg is calculated, and the corresponding command is given using text-to-speech.
6. The same direction is also visible on the application screen.
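Steps 1-3 above can be sketched as a small function that turns the difference between the bearing to the next leg and the user's heading into a spoken command. The 45-degree sector boundaries are illustrative assumptions; the paper does not state them.

```python
# Illustrative sketch of steps 1-3: bearing/heading difference -> turn command.
# Bearings use the compass convention: North = 0 degrees, East = 90 degrees.

def turn_command(heading_deg: float, bearing_deg: float) -> str:
    """heading: direction the user currently faces; bearing: direction
    to the next leg. Returns one of 'straight', 'right', 'back', 'left'."""
    diff = (bearing_deg - heading_deg) % 360  # normalise to [0, 360)
    if diff <= 45 or diff >= 315:
        return "straight"
    if diff < 135:
        return "right"
    if diff <= 225:
        return "back"
    return "left"
```

The returned string would then be handed to the text-to-speech engine and mirrored on screen, as described in steps 5 and 6.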
5. System Architecture
The TouchTapSee system architecture, briefly discussed in the introduction, consists of the following components: Database, Android Client, and Bluetooth Beacon.
Fig. 4. System architecture
5.1 Database:
The database is used to store client information, such as login details, and to share the user's geographical location.
5.2 Android Smartphone:
Components of the software implemented in the Android Smartphone are as follows:
a. Bluetooth module: This module is responsible for exchanging data between the Android smartphone and the Bluetooth beacon.
b. TouchTapSee application: This application offers various events which can be chosen by the user. The events are processed locally, and the output is converted into audio form.
c. Text-to-Speech engine: As the TouchTapSee system is designed to assist the legally blind, the interaction between the user and the smartphone is done with the help of voice commands.
6. Implementation Tools
Keeping in mind the advanced technologies available for providing sleek, modern, and efficient functionality, the system takes advantage of the following technologies:
6.1 Android Studio:
Android Studio is the official integrated development environment (IDE) for Google's Android operating system, built on JetBrains' IntelliJ IDEA software and designed specifically for Android development. It provided the options required to implement the project with ease.
6.2 Google Firebase:
Firebase Cloud Storage makes it quick and easy to store and serve user-generated content, such as photos and videos, and to grow from prototype to production using the same technology. Real-time syncing of data is required in order to achieve real-time location sharing; this is done through the Firebase Realtime Database, a cloud-hosted NoSQL database that stores and syncs JSON data between users in real time. Its NoSQL nature provides flexibility in the database schema.
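To illustrate the flexible JSON structure, a location-sharing record in the Realtime Database might look like the document below. The field names, values, and tree layout are purely illustrative assumptions; the paper does not describe the actual schema.

```python
# Illustrative sketch of a JSON tree as it might be synced for location sharing.
import json

location_update = {
    "users": {
        "user_01": {                       # hypothetical user id
            "building": "Main Block",      # hypothetical building name
            "nearest_beacon": "beacon_3F_12",
            "lat": 19.0728,
            "lng": 72.8997,
            "timestamp": 1554533400,
        }
    }
}

payload = json.dumps(location_update)   # serialised form the client would sync
restored = json.loads(payload)          # what another client would read back
```

Because the database is schemaless, fields such as `nearest_beacon` can be added or dropped per record without migrations.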
6.3 IndoorAtlas:
IndoorAtlas is an indoor positioning platform for mobile applications, providing features like indoor positioning, wayfinding, geofencing, asset tracking, and location intelligence.
7. Compare and Contrast
The proposed system is compared with existing systems in the same area; the observations are presented in the following table.
Table 1 - Comparison with existing systems.

Comparison Factor                 ARIANNA App   TouchTapSee   Travi-Navi
Tactile Interface                 ✔             ❌            ✔
Haptic Feedback                   ✔             ✔             ✔
Augmented Reality                 ✔             ❌            ❌
Phone camera not required         ❌            ✔             ❌
QR code not required              ❌            ✔             ✔
Stable net connection required    ❌            ✔             ❌
Marker tape not required          ❌            ✔             ✔
Voice-activated commands          ❌            ✔             ❌
8. Results and Discussion
The application Touch Tap See has been implemented and tested, and the results of the proposed system prototype are fairly promising. The floor map was effectively used to mark nearby locations, and the Bluetooth beacons provided the required accuracy with minimal power consumption.
Fig.5. User location during indoor navigation
This is how the map initially appears, before a search is made.
Fig. 6. Highlighted Waypoints    Fig. 7. Path from Source to Destination
The figures above show how the map appears after the current location has been identified via the Bluetooth beacons, and what the path from the current location to the searched place looks like.
9. Future Scope
The final outcome of this research work is expected to be a navigation assistant consisting of a mobile app and a wearable obstacle avoidance unit, which will help a blind person navigate independently in indoor areas. We plan to include more detailed assistance, such as briefing the blind person about the place where he or she is currently positioned and the path he or she will take to reach the destination. The system can be made offline (not using mobile data) by using the network signal of the network service provider. The way-finding algorithm can be extended to process the
distance data measured from the signal strength of the network. The availability of data services is a major issue for the system to work at mass scale.
10. Conclusion
The systems currently available in the market are not viable options for everyone, since they are paid and provide low accuracy. The way-finding algorithm gives much higher accuracy compared to the generalized algorithms used in the available systems. Obstacle detection, together with indoor and outdoor navigation, provides an overall package for leading a productive and efficient life. Application crashes are effectively handled by the standalone obstacle detection module, which removes complete dependency on the application. Hence the proposed system provides a robust solution to the problems visually impaired people face with their basic indoor navigation requirements. Thus the system serves as a helping hand and also maintains the dignity of the visually impaired person.
Acknowledgements
The authors would like to acknowledge the management of K. J. Somaiya College of Engineering for supporting this research work.
REFERENCES
Villanueva, Joselin, and René Farcy, "Optical Device Indicating a Safe Free Path to Blind People", IEEE Transactions on Instrumentation and Measurement, 2012.
Website: [Link], accessed on 1st April 2019.
Hu JY, Yan L, Chen YD, et al., "Population-based survey of prevalence, causes, and risk factors for blindness and visual impairment in an aging Chinese metropolitan population", Int J Ophthalmol, 2017;10(1):140–147, published 2017 Jan 18, doi:10.18240/ijo.2017.01.23.
Daniele Croce, Pierluigi Gallo, Domenico Garlisi, Laura Giarre, Stefano Mangione, Ilenia Tinnirello, "ARIANNA: a smartphone-based navigation system with human in the loop", 2014.
Yuanqing Zheng, Guobin Shen, Liqun Li, Chunshui Zhao, Mo Li, Feng Zhao, "Travi-Navi: Self-Deployable Indoor Navigation System", 2014.
Zuwei Yin, Chenshu Wu, Zheng Yang, and Yunhao Liu, "Peer-to-Peer Indoor Navigation Using Smartphones", 2017.
Jiang Dong, Yu Xiao, Marius Noreikis, Zhonghong Ou, Antti Ylä-Jääski, "iMoon: Using Smartphones for Image-based Indoor Navigation", 2015.
Dragan Ahmetovic, Masayuki Murata, Cole Gleason, Erin Brady, Hironobu Takagi, Kris Kitani, Chieko Asakawa, "Achieving Practical and Accurate Indoor Navigation for People with Visual Impairments", The 14th International Web for All Conference, April 2017.
S. Willis, S. Helal, "RFID information grid and wearable computing solution to the problem of wayfinding for the blind user in a campus environment", IEEE International Symposium on Wearable Computers (ISWC 05), 2005.