Intelligent Assistive System for the Visually Impaired
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/JSEN.2020.2990609, IEEE Sensors Journal
IEEE SENSORS JOURNAL: Sensors-31523-2020.R1
Manuscript received February 26, 2020; revised April 13, 2020; accepted April 23, 2020. This work was supported in part by the Ministry of Science and Technology (MoST), Taiwan, R.O.C., under the following grants: MOST 108-2637-E-218-004 and MOST 108-2622-8-218-004-TE2. This work was also supported in part by the Ministry of Education (MoE), Taiwan, R.O.C., under Grant 1300-107P735 through the Advanced Intelligent Biomedical Allied Research Center under the Higher Education Sprout Project. An earlier and briefer version of this paper was presented at the 2019 IEEE ICCE [30], Las Vegas, NV, USA, and was published (DOI: 10.1109/ICCE.2019.8661943) in its Proceedings. (Corresponding author: Wan-Jung Chang.)

W.-J. Chang is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: allenchang@[Link]).
L.-B. Chen is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: [Link]@[Link]).
M.-C. Chen is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: jerryhata@[Link]).
J.-P. Su is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: jianpingsu1111@[Link]).
C.-Y. Sie is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C.
C.-H. Yang is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: 4a339036@[Link]).

I. INTRODUCTION

Recently, the World Health Organization (WHO) released a global population statistical report [1] in 2017 noting that of the approximately 7.338 billion people in the world, approximately 285 million people were visually impaired as of the end of 2016. Fig. 1 shows the proportion of visually impaired people in the total world population. Among these visually impaired people, 39 million were blind, and 246 million had amblyopia, together accounting for 4.25% of the total population of the world.

Whether due to congenital inheritance, acquired diseases, accidental injuries, or other reasons, visual impairment causes a certain level of inconvenience to their lives. For example, it is very difficult for visually impaired people to walk independently in strange and complex areas.

Currently, visually impaired people mostly use white sticks, guide dogs, or voice guidance devices (with built-in A-GPS) to provide directions and improve their walking safety when they go out independently. However, in complex outdoor environments, visually impaired people still face the following two common dangerous walking situations [2]-[5].

1) Collisions with Aerial Obstacles in Front
According to Manduchi and Kurniawan [3], 15% of visually impaired people hit obstacles every month on average, and 40% of visually impaired people fall down every year because they
1558-1748 (c) 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See [Link] for more information.
hit obstacles. In particular, aerial obstacles, such as awnings, tree branches, and similar objects, typically have no projection on the ground or floor [4]. Moreover, these aerial obstacles cannot be detected by a general walking stick. Two examples of aerial obstacles are shown in Fig. 2.

Fig. 2. Two examples of front aerial obstacles.

2) High Fall Risk
Legood et al. [5] pointed out that a visually impaired person's chance of falling is 1.7 times that of the average person, and the probability of subsequent falling after standing up is 1.9 times greater. When a visually impaired person walks alone and falls, he may feel less confident and helpless, potentially resulting in serious injury.

To overcome the two issues mentioned above, we design and implement an intelligent assistive system for visually impaired people to provide functionalities for outdoor walking safety, including detection and notification of falls and collisions with aerial obstacles.

The remainder of this paper is organized as follows. Section II reviews related work. The proposed assistive system is described in Section III. A prototype of the proposed assistive system and related experiments are provided in Section IV. Finally, Section V concludes this work and offers a discussion of future research directions.

II. RELATED WORK

Many related studies [6]-[30] on assisting visually impaired people have been reviewed. Elmannai and Elleithy [6] surveyed the present development status of sensor-based assistive aids/devices and provided some possible future development trends. Tapu et al. [7] surveyed the state of the art in wearable assistive device development prior to 2018. Islam et al. [8] reviewed the development of assistive walking devices. Hakobyan et al. [9] surveyed the modern development of mobile assistive technologies for visually impaired people. Furthermore, Zubov [10] proposed a thin client-based concept of a smart city assistive infrastructure for blind and visually impaired people.

To address fall detection and prevention, Chaccour et al. [11] performed a generic classification of existing fall-related systems, from fall detection to fall prevention. They also suggested some future development directions for assistive devices for visually impaired people. Kau and Chen [12] proposed a fall accident detection system based on a smartphone with 3G mobile networks. A state machine was implemented for fall detection, and the corresponding features were investigated in a sequential manner.

Generally, walking sticks have become a basic aid for visually impaired people. Several related works [13]-[19] developed intelligent walking sticks to assist visually impaired people in obstacle avoidance or walking guidance. Wang and Kuchenbecker [13] presented an assistive device called HALO, which was mounted on a traditional white stick to alert for low-hanging obstacles via haptic cues. Sharma et al. [14] designed a smart stick that assisted visually impaired people in safely moving around holes, water, obstacles, and stairs. Cardillo et al. [15] designed an electromagnetic sensor-based system that was mounted on a conventional white stick to assist with the autonomous walking of visually impaired people. Villanueva and Farcy [16] implemented an active optical-based electronic travel assistive device that used a photodiode and an LED. This device was integrated with a general white stick.

Currently, artificial intelligence (AI) techniques have been applied to walking sticks to create smart walking sticks [17], [18]. For example, Degaonkar et al. [17] presented a smart stick that used AI with ultrasonic sensors to help visually impaired people navigate. Image recognition, collision detection, and obstacle detection were provided by the presented smart stick. Pruthvi et al. [18] designed a smart blind stick that adopts a popular deep learning model, YOLO [19], to provide object detection and classification. This device provides object information by voice for visually impaired people.

In recent years, many previous works have investigated guided walking and navigation applications. For example, Bai et al. [20] designed an electronic travel aid (ETA) for which smart glasses were adopted as a major wearable device to assist with indoor travel for visually impaired people. The ETA was implemented by integrating a depth sensor with an ultrasonic sensor to solve the problem of detecting few obstacles. Moreover, they further developed a dynamic sub-destination routing plan [21] that can assist visually impaired people in bypassing obstacles in front and simultaneously guide them to their destination.

Lee et al. [22] proposed a wearable glasses-based indoor positioning system. Ultrasound and image sensors were mounted on the wearable glasses. The proposed wearable glasses can identify certain color-coded markers in real time and detect obstacles in front within a distance of 15 meters.

Aladrén et al. [23] presented a navigation assistive device that adopted a consumer RGB-D image sensor to take advantage of both wide-range and color visual information for detecting obstacle-free paths.

Ramadhan [24] proposed a wearable system composed of a
and IR transceiver sensors. We choose an IR transceiver sensor for aerial obstacle detection in this paper because such sensors are inexpensive, and visually impaired persons tend to be financially limited.

As a result, we need to detect aerial obstacles by adopting an IR transceiver sensor. We choose an IR transceiver sensor module (SHARP GP2Y0A710K0F) [31] that can measure distances between 1 and 5.5 meters. As shown in Fig. 5, the output voltage of the adopted IR transceiver sensor module corresponds to the detection distance. Therefore, we can use the change in voltage to determine the distance. The induction distance we set is 0.5 to 3 meters, which corresponds to the distance to an object (obstacle) when the output voltage is more than 1.5 V.

Fig. 5. The relationship between the output voltage and the measured distance [31].

Fig. 6. The relationship between the output voltage and the inverse of distance [31].

2) Triangulation Method for Distance Detection of Aerial Obstacles in Front
As shown in Fig. 7, we use the triangulation method [32] for distance detection of aerial obstacles in front. The main reason is that the detection distance of aerial obstacles in front is not easily affected by the reflectivity of various objects, the ambient temperature, or the operation time. The triangulation method is based on the assumption that reference points with known coordinates can form a triangle. The two known coordinates are those of the IR emitting diode (TX) and the IR position sensor diode (RX) on the IR transceiver sensor module, and the measurement target point is an obstacle. As a result, we can calculate the length of the reference side of the triangle and measure the angles between the two reference points and the target point to find the distance and coordinates of the target point.

Fig. 8. The motion tracking sensor module (6-axis sensor module) is used to judge five postures (states) of the human body, including standing, sitting, lying, running, and walking.
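The 1.5 V induction threshold above amounts to a one-line check. A minimal sketch in Python, where the ADC read-out of the sensor voltage is assumed to happen elsewhere:

```python
# Minimal sketch of the induction-distance check described above: the
# SHARP GP2Y0A710K0F output voltage rises as an obstacle approaches, and
# the paper treats readings above 1.5 V as an obstacle within the 0.5-3 m
# range.

OBSTACLE_THRESHOLD_V = 1.5  # voltage corresponding to the 0.5-3 m range

def aerial_obstacle_detected(voltage_v: float) -> bool:
    """Return True when the IR sensor voltage indicates an obstacle in range."""
    return voltage_v > OBSTACLE_THRESHOLD_V
```
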
θ_Acc = arctan(x / z)                                        (1)

Fig. 9. The accelerometer and gyro signals of the motion tracking sensor module can obtain the vertical angle (θ_Acc) between the upper human body (smart glasses) and the ground (g) during movement.

Hence, the vertical angle (θ_Acc) is added to the gyro sensor of the motion tracking sensor module to measure the value of the angle θ_Gyro, which is calculated using the multisensor data fusion Kalman filtering model [35] to obtain the corrected upper human behavior tilt angle θ, as shown in equations (2) and (3). The Kalman filtering model is used to filter noise from the accelerometer. Please note that u_k denotes the input data and bias denotes the bias of the gyro measurement.

θ_k = θ_{k-1} + (u_k - bias_{k-1}) dt                        (2)

[θ_k; bias_k] = [1, -dt; 0, 1] [θ_{k-1}; bias_{k-1}] + [dt; 0] u_k   (3)

2) Upright and Tilt Action Classification
When we obtain the value of the corrected upper human behavior tilt angle θ, a threshold value of the angle can be used for classification. The threshold value of the angle is set to 30 degrees. When the measured value of the angle is less than 30 degrees, the user is judged to be in the upright stage; otherwise, the smart glasses are judged to have undergone a change in tilt.

x(t_{k+1}) ≈ x(t_k) + (Δx_1 + 2Δx_2 + 2Δx_3 + Δx_4)/6       (4)

4) Posture Conversion Recognition
Then, we calculate the maximum difference (XMaxDiff) between the maximum and minimum vertical acceleration measured during each sampling period i (XMaxDiff = Max{a_i} - Min{a_i}, ∀i = 1, 2, 3, …, n). In the ith sampling period, if XMaxDiff > D_th (a preset threshold value), then a posture conversion is determined, and an action is performed for recognition of motion patterns in the ith sampling period. Through this recognition, if the acceleration spread of the motion sensor module configured in the wearable smart glasses is greater than the preset threshold value D_th (i.e., XMaxDiff_x > D_th) during the ith sampling period, the wearable smart glasses are currently in a state of a rapid change in tilt.

Next, we supplement the posture conversion recognition to determine the fall posture, which can improve the accuracy of fall detection and reduce the probability of misjudgment. In other words, we integrate the center of the human torso with the posture (state) changes for standing, sitting, lying, running, and walking to determine the fall posture (state), as shown in Fig. 11.
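The angle fusion in equations (1)-(3) can be sketched as a standard two-state Kalman tilt filter: the accelerometer angle arctan(x/z) corrects a gyro-integrated angle whose bias is carried as a second state. The noise constants below are illustrative assumptions, not values from the paper:

```python
import math

# Two-state Kalman tilt filter sketch: state = [angle, gyro bias].
# Prediction follows eq. (2)/(3); correction uses the accelerometer
# angle of eq. (1). Noise constants are illustrative assumptions.

class TiltKalman:
    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.angle = 0.0  # corrected tilt angle theta (degrees)
        self.bias = 0.0   # estimated gyro bias (degrees/s)
        self.P = [[0.0, 0.0], [0.0, 0.0]]  # state covariance
        self.q_angle, self.q_bias, self.r = q_angle, q_bias, r_measure

    def update(self, gyro_rate, acc_x, acc_z, dt):
        # Prediction: theta += (u_k - bias) * dt, bias held constant.
        self.angle += (gyro_rate - self.bias) * dt
        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + self.q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += self.q_bias * dt
        # Correction with the accelerometer angle theta_Acc = arctan(x/z).
        theta_acc = math.degrees(math.atan2(acc_x, acc_z))
        y = theta_acc - self.angle
        s = P[0][0] + self.r
        k0, k1 = P[0][0] / s, P[1][0] / s
        self.angle += k0 * y
        self.bias += k1 * y
        p00, p01 = P[0][0], P[0][1]
        P[0][0] -= k0 * p00
        P[0][1] -= k0 * p01
        P[1][0] -= k1 * p00
        P[1][1] -= k1 * p01
        return self.angle
```

With a stationary accelerometer reading of (x, z) = (1, 1), repeated updates converge toward the 45-degree accelerometer angle while the bias state absorbs any constant gyro offset.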
Fig. 10. [Figure residue: Power On → Upright Stage / Tilt Stage.]

Fig. 11. Posture (state) changes for standing, sitting, lying, running, and walking to determine the fall posture (state).

2) Feature Acquisition and Analysis
Similar to that of the wearable smart glasses, the accelerometer signal of the motion tracking sensor module can obtain the vertical angle (θ_Acc) between the proposed intelligent walking stick and the ground (g) during movement, as shown in equation (1) and Fig. 12.
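The XMaxDiff trigger described above can be sketched as a sliding-window check; the window length and threshold below are illustrative assumptions, not values from the paper:

```python
# Sketch of the posture-conversion trigger: within each sampling period,
# the spread max{a_i} - min{a_i} is compared against the preset threshold
# Dth; periods exceeding it mark a rapid change in tilt. Window length and
# threshold are assumed for illustration.

D_TH = 1.2   # preset threshold Dth, assumed value (g)
WINDOW = 50  # samples per sampling period, assumed

def posture_conversion_windows(accel, window=WINDOW, d_th=D_TH):
    """Return indices of sampling periods whose acceleration spread exceeds d_th."""
    flagged = []
    for start in range(0, len(accel) - window + 1, window):
        chunk = accel[start:start + window]
        if max(chunk) - min(chunk) > d_th:
            flagged.append(start // window)
    return flagged
```
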
TABLE I
DISTANCE CONVERSION VOLTAGE MEASUREMENTS BY THE ADOPTED IR TRANSCEIVER SENSOR MODULE

Distance (cm) | Min (V) @47% RH | Max (V) @47% RH | Min (V) @71% RH | Max (V) @71% RH | Min (V) @95% RH | Max (V) @95% RH | Average Output Voltage (V)
50            | 3.09            | 3.14            | 3.06            | 3.11            | 3.05            | 3.09            | 3.09
100           | 2.48            | 2.54            | 2.46            | 2.53            | 2.43            | 2.46            | 2.48
150           | 2.23            | 2.26            | 2.23            | 2.25            | 2.21            | 2.24            | 2.24
200           | 1.68            | 1.70            | 1.68            | 1.69            | 1.67            | 1.68            | 1.68
250           | 1.56            | 1.63            | 1.54            | 1.60            | 1.53            | 1.57            | 1.57
300           | 1.51            | 1.57            | 1.53            | 1.58            | 1.50            | 1.55            | 1.54
350           | 1.48            | 1.53            | 1.46            | 1.52            | 1.47            | 1.50            | 1.49
400           | 1.47            | 1.52            | 1.45            | 1.51            | 1.43            | 1.46            | 1.47
450           | 1.45            | 1.48            | 1.40            | 1.43            | 1.38            | 1.41            | 1.43
500           | 1.38            | 1.43            | 1.36            | 1.39            | 1.21            | 1.31            | 1.35
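A voltage reading can be mapped back to a distance using the averaged Table I values. The (distance, voltage) pairs below come from the table's last column; interpolating linearly between rows is our illustration, not a step the paper states explicitly:

```python
# Convert the sensor's output voltage to a distance by linear interpolation
# over the averaged Table I measurements. Readings outside the measured
# range are clamped to the nearest table entry.

TABLE_I_AVG = [(50, 3.09), (100, 2.48), (150, 2.24), (200, 1.68), (250, 1.57),
               (300, 1.54), (350, 1.49), (400, 1.47), (450, 1.43), (500, 1.35)]

def voltage_to_distance_cm(v: float) -> float:
    """Interpolate distance from voltage; clamp outside the measured range."""
    if v >= TABLE_I_AVG[0][1]:
        return float(TABLE_I_AVG[0][0])
    if v <= TABLE_I_AVG[-1][1]:
        return float(TABLE_I_AVG[-1][0])
    for (d0, v0), (d1, v1) in zip(TABLE_I_AVG, TABLE_I_AVG[1:]):
        if v1 <= v <= v0:  # voltage falls between two adjacent rows
            return d0 + (d1 - d0) * (v0 - v) / (v0 - v1)
    return float(TABLE_I_AVG[-1][0])
```
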
Fig. 13. [Flowchart residue: start → perform fall recognition; IR detects aerial obstacles; a recognized fall event and its GPS location are uploaded to the cloud-based information management platform via the LoRa-based LPWAN.]
Hence, the location and time of occurrence are uploaded to the proposed cloud-based information management platform for storage, and a message is forwarded to the V-Protector app to alert the family members or caregivers of the visually impaired user. However, if the proposed intelligent walking stick does not recognize a fall event but receives a fall signal from the proposed wearable smart glasses, this fall event will be ignored due to misjudgment by the smart glasses. Through the fusion of the two fall recognition methods of the proposed wearable smart glasses and the proposed intelligent walking stick, the fall event recognition rate can be improved; moreover, false alarms can be avoided.
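The decision fusion just described can be sketched as a small rule: the walking stick's own recognition is treated as the deciding check, and a glasses-only fall signal is discarded as a misjudgment. The return labels are our own illustrative names:

```python
# Fusion of the two fall recognition results, per the rule described above:
# a fall recognized by the stick is reported; a glasses-only signal is
# ignored as a misjudgment by the smart glasses.

def fused_fall_decision(glasses_fall: bool, stick_fall: bool) -> str:
    if stick_fall:
        return "report"  # upload the event and GPS location, alert V-Protector
    if glasses_fall:
        return "ignore"  # glasses-only signal: treated as a misjudgment
    return "none"
```
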
F. LoRa-Based Low-Power Wide-Area Network (LPWAN)
Visually impaired people mostly encounter dangerous events outdoors; they usually do not have the habit of carrying a smartphone, and they always hold a walking stick to assist them while walking on outdoor roads. Hence, it is not suitable to require a visually impaired user to have a smartphone to send fall event messages or to seek assistance. Based on the abovementioned reasons, we implement the function of fall event announcement on the proposed intelligent walking stick via a LoRa-based LPWAN [37]-[39].

Finally, the flowchart of the proposed intelligent assistive system is shown in Fig. 13.

IV. PROTOTYPE DEMONSTRATION AND EXPERIMENTS

Figs. 14 and 15 demonstrate photographs of a prototype implementation of the proposed wearable smart glasses and the proposed intelligent walking stick, respectively.

Fig. 14 shows a prototype of the proposed wearable smart glasses. As shown in Fig. 14, an IR transceiver sensor module (see Fig. 14 (a)), a 6-axis (gyro + accelerometer) microelectromechanical system (MEMS) motion tracking sensor module (see Fig. 14 (d)), a microcontroller unit (MCU) (see Fig. 14 (c)), a Bluetooth low energy (BLE) wireless communication module (see Fig. 14 (d)), and a battery charging module (see Fig. 14 (b)) are mounted on the wearable smart glasses. The proposed wearable smart glasses and the proposed intelligent walking stick are connected and paired by BLE wireless communication to achieve the fusion of the two fall recognition methods.

Fig. 14. Photographs of the prototype of the proposed wearable smart glasses.

In addition to the standard functions of the 8051 MCU, many system-level functions have been integrated into this MCU. These functions can effectively reduce the circuit board area and cost. In other words, we need a miniature circuit board for embedding circuit modules into the proposed wearable smart glasses. To achieve this purpose, we design and implement the PCB, including an MCU, a BLE module, a 6-axis sensor module, and a battery charging module, as seen in Fig. 14 (b) to (d).

The proposed intelligent walking stick does not have space limitations. Hence, some existing development boards (such as Raspberry Pi [41] and LinkIt™ ONE [42]) can be selected and used as an MPU for the proposed walking stick. Thus, we adopt the LinkIt™ ONE development board as the MPU for the walking stick.

The LinkIt™ ONE development board integrates a BLE module, a LoRa-based LPWAN communication module, a GPS module, a 6-axis (gyro + accelerometer) MEMS motion tracking sensor, and a vibration motor module, where the BLE module is responsible for pairing and connecting with the proposed wearable smart glasses.

When the intelligent walking stick receives an aerial obstacle detection from the proposed wearable smart glasses, the vibration motor is actuated to warn the visually impaired person via vibration. Moreover, the LoRa-based LPWAN module is responsible for uploading the fall information with GPS to the proposed cloud-based information management platform, and the 6-axis (gyro + accelerometer) MEMS motion tracking sensor mounted on the proposed intelligent walking stick is responsible for co-recognition with the proposed wearable smart glasses. Hence, the current posture of the visually impaired person can be recognized to determine whether the person has truly fallen.

Fig. 15 shows a photograph of a prototype of the proposed intelligent walking stick. As shown in Fig. 15, a vibration motor, a GPS module, an MPU, a BLE wireless communication module, a LoRa-based LPWAN wireless communication module, and a 6-axis (gyro + accelerometer) MEMS motion tracking sensor module are mounted on a walking stick.
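The paper states that a fall event is uploaded together with its GPS location and time of occurrence, but does not give a payload format. A hypothetical sketch of such a message, where every field name is our assumption for illustration:

```python
import json
import time

# Hypothetical fall message the stick might upload over the LoRa-based
# LPWAN. The "type"/"lat"/"lon"/"timestamp" field names are assumptions,
# not a format defined in the paper.

def build_fall_event(lat: float, lon: float) -> str:
    """Serialize a fall event with its GPS position and timestamp."""
    event = {
        "type": "fall",
        "lat": lat,
        "lon": lon,
        "timestamp": int(time.time()),  # time of occurrence
    }
    return json.dumps(event)
```
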
TABLE II
EXPERIMENTS ON POSTURE CONVERSION RECOGNITION FOR THREE SUBJECTS

                | Standing | Sitting | Lying   | Running | Walking | Falling | Summary
Success (count) | 138      | 137     | 150     | 145     | 141     | 140     | 851 times (total)
Total (count)   | 150      | 150     | 150     | 150     | 150     | 150     | 900 times (total)
Accuracy        | 92.00%   | 91.33%  | 100.00% | 96.67%  | 94.00%  | 93.33%  | 94.56% (total)
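The accuracy rows in Table II follow directly from the success and total counts; a quick arithmetic check:

```python
# Each accuracy entry in Table II is success/total, and the overall figure
# pools all 900 trials (150 attempts per posture across the three subjects).

SUCCESS = {"standing": 138, "sitting": 137, "lying": 150,
           "running": 145, "walking": 141, "falling": 140}
TRIALS_PER_POSTURE = 150

overall = sum(SUCCESS.values()) / (TRIALS_PER_POSTURE * len(SUCCESS))
standing_acc = SUCCESS["standing"] / TRIALS_PER_POSTURE
```
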
TABLE III
EXPERIMENTS ON FALL DETECTION FOR THREE SUBJECTS
Fig. 17 demonstrates execution screens of the proposed mobile device application (V-Protector). Fig. 17 (b) shows the screen reporting notification messages of accident occurrence for the visually impaired user. Fig. 17 (c) shows the screen reporting the location of the accident occurrence for the visually impaired user.

In other words, the visually impaired user will be mapped to their family members' or caregivers' smartphone through the "Token" value in the event of an accident, and an emergency message can be sent to the V-Protector app via the Firebase push function to alert the family members or caregivers. In addition, the intelligent walking stick will upload the GPS latitude and longitude to the cloud-based information management platform every 5 minutes. As a result, the family members or caregivers can view the location of the visually impaired user at any time in the V-Protector app.

Fig. 16. Execution screen of the proposed cloud-based information management platform.

(a) (b) (c)
Fig. 17. Execution screens of the proposed mobile device application.

V. CONCLUSIONS AND FUTURE WORK

In this paper, an intelligent assistive system has been proposed for visually impaired people for aerial obstacle avoidance and fall detection. The proposed intelligent assistive system comprises wearable smart glasses, an intelligent walking stick, a cloud-based information platform, and a mobile device app. Through the fusion of the two fall recognition methods of the proposed wearable smart glasses and the proposed intelligent walking stick, the fall event recognition rate can be improved; moreover, false alarms can be avoided. As a result, the experimental results show that the proposed system can efficiently detect aerial obstacles within a distance of 3 meters, and the average accuracy of fall detection reaches up to 98.3% by using the fusion of the two fall recognition methods.

Furthermore, when visually impaired people experience a fall event, an urgent notification will immediately be sent to their family members or caregivers via the proposed mobile device app. When aerial obstacles in front are detected, the proposed intelligent stick will vibrate to guide visually impaired people to avoid collision accidents. In future work, we will integrate deep learning techniques for recognizing front aerial and ground images (such as traffic signs and traffic cones) for obstacle avoidance and develop intelligent guided walking-related functions.

ACKNOWLEDGMENTS

The authors would like to thank the Suang Lien Foundation for the Visually Impaired, Taipei, Taiwan, R.O.C., and the Tainan Blind Welfare Association, Tainan, Taiwan, R.O.C., for providing many valuable comments and suggestions for this work.

REFERENCES

[1] World Health Organization (WHO), "Visual impairment and blindness research," Oct. 11, 2017. [Online]. Available: [Link]
[2] W.-J. Chang, J.-P. Su, L.-B. Chen, M.-C. Chen, C.-H. Hsu, C.-H. Yang, C.-Y. Sie, and C.-H. Chuang, "An AI edge computing based wearable assistive device for visually impaired people zebra-crossing walking," in Proceedings of the 2020 IEEE International Conference on Consumer Electronics (ICCE), Jan. 2020, pp. 1-2.
[3] R. Manduchi and S. Kurniawan, "Watch your head, mind your step: mobility-related accidents experienced by people with visual impairment," Technical Report, Univ. California, Santa Cruz, 2010.
[4] J. M. Sáez, F. Escolano, and M. A. Lozano, "Aerial obstacle detection with 3-D mobile devices," IEEE Journal of Biomedical and Health Informatics, vol. 19, no. 1, pp. 74-80, Jan. 2015.
[5] R. Legood, P. Scuffham, and C. Cryer, "Are we blind to injuries in the visually impaired? A review of the literature," Injury Prevention, vol. 8, pp. 155-160, 2002.
[6] W. Elmannai and K. Elleithy, "Sensor-based assistive devices for visually-impaired people: current status, challenges, and future directions," Sensors, vol. 17, no. 3, article 565, pp. 1-42, Mar. 2017.
[7] R. Tapu, B. Mocanu, and T. Zaharia, "Wearable assistive devices for visually impaired: A state of the art survey," Pattern Recognition Letters, Oct. 2018, [Link]
[8] M. M. Islam, M. S. Sadi, K. Z. Zamli, and M. M. Ahmed, "Developing walking assistants for visually impaired people: a review," IEEE Sensors Journal, vol. 19, no. 8, pp. 2814-2828, Apr. 2019.
[9] L. Hakobyan, J. Lumsden, D. O'Sullivan, and H. Bartlett, "Mobile assistive technologies for the visually impaired," Survey of Ophthalmology, vol. 58, no. 6, pp. 513-528, Nov.-Dec. 2013.
[10] D. Zubov, "A smart city assistive infrastructure for the blind and visually impaired people: a thin client concept," Broad Research in Artificial Intelligence and Neuroscience (BRAIN), vol. 9, no. 4, pp. 25-37, Nov. 2018.
[11] K. Chaccour, R. Darazi, A. H. E. Hassani, and E. Andrès, "From fall detection to fall prevention: a generic classification of fall-related systems," IEEE Sensors Journal, vol. 17, no. 3, pp. 812-822, Feb. 2017.
[12] L.-J. Kau and C.-S. Chen, "A smart phone-based pocket fall accident detection, positioning, and rescue system," IEEE Journal of Biomedical and Health Informatics, vol. 19, no. 1, pp. 44-56, Jan. 2015.
[13] Y. Wang and K. J. Kuchenbecker, "HALO: haptic alerts for low-hanging obstacles in white cane navigation," in Proceedings of the IEEE Haptics Symposium 2012, 2012, pp. 527-532.
[14] H. Sharma, M. Tripathi, A. Kumar, and M. S. Gaur, "Embedded assistive stick for visually impaired persons," in Proceedings of the 2018 9th International Conference on Computing, Communication and Networking Technologies (ICCCNT), 2018, pp. 1-6.
[15] E. Cardillo, V. D. Mattia, G. Manfredi, P. Russo, A. D. Leo, A. Caddemi, and G. Cerri, "An electromagnetic sensor prototype to assist visually impaired and blind people in autonomous walking," IEEE Sensors Journal, vol. 18, no. 6, pp. 2568-2576, Mar. 2018.
[16] J. Villanueva and R. Farcy, "Optical device indicating a safe free path to blind people," IEEE Transactions on Instrumentation and Measurement, vol. 61, no. 1, pp. 170-177, Jan. 2012.
[17] S. Degaonkar, M. Gupta, P. Hiwarkar, M. Singh, and S. A. Itkar, "A smart walking stick powered by artificial intelligence for the visually impaired," International Journal of Computer Applications, vol. 178, no. 32, pp. 7-10, Jul. 2019.
[18] S. Pruthvi, P. S. Nihal, R. R. Menon, S. S. Kumar, and S. Tiwari, "Smart blind stick using artificial intelligence," International Journal of Engineering and Advanced Technology (IJEAT), vol. 8, no. 5S, pp. 19-22, May 2019.
[19] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: unified, real-time object detection," in Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 779-788.
[20] J. Bai, S. Lian, Z. Liu, K. Wang, and D. Liu, "Smart guiding glasses for visually impaired people in indoor environment," IEEE Transactions on Consumer Electronics, vol. 63, no. 3, pp. 258-266, Aug. 2017.
[21] J. Bai, S. Lian, Z. Liu, K. Wang, and D. Liu, "Virtual-blind-road following-based wearable navigation device for blind people," IEEE Transactions on Consumer Electronics, vol. 64, no. 1, pp. 136-143, Feb. 2018.
[22] C.-W. Lee, P. Chondro, S.-J. Ruan, O. Christen, and E. Naroska, "Improving mobility for the visually impaired: a wearable indoor positioning system based on visual marker," IEEE Consumer Electronics Magazine, vol. 7, no. 3, pp. 12-20, 2018.
[23] A. Aladrén, G. López-Nicolás, L. Puig, and J. J. Guerrero, "Navigation assistance for the visually impaired using RGB-D sensor with range
[34] H.-T. Lin, T.-J. Hsieh, M.-C. Chen, and W.-R. Chang, "ActionView: a movement-analysis ambulatory monitor in elderly homecare systems," in Proceedings of the 2009 IEEE International Symposium on Circuits and Systems (ISCAS), 2009, pp. 3098-3101.
[35] S.-L. Sun and Z.-L. Deng, "Multi-sensor optimal information fusion Kalman filter," Automatica, vol. 40, no. 6, pp. 1017-1023, Jun. 2004.
[36] Wikipedia, "Runge–Kutta methods," [Online]. Available: [Link]
[37] B. Buurman, J. Kamruzzaman, G. Karmakar, and S. Islam, "Low-power wide-area networks: design goals, architecture, suitability to use cases and research challenges," IEEE Access, vol. 8, no. 1, pp. 17179-17220, 2020.
[38] Q. M. Qadir, T. A. Rashid, N. K. Al-Salihi, B. Ismael, A. A. Kist, and Z. Zhang, "Low power wide area networks: a survey of enabling technologies, applications and interoperability needs," IEEE Access, vol. 6, no. 1, pp. 77454-77473, 2018.
[39] Y.-S. Chou, Y.-C. Mo, J.-P. Su, W.-J. Chang, L.-B. Chen, J.-J. Tang, and C.-T. Yu, "i-Car system: a LoRa-based low power wide area networks vehicle diagnostic system for driving safety," in Proceedings of the 2017 IEEE International Conference on Applied System Innovation (ICASI), 2017, pp. 789-791.
[40] Megawin Technology Co., Ltd., "MPC82G516AD Datasheet," [Online]. Available: [Link]
[41] Raspberry Pi. [Online]. Available: [Link]
[42] MediaTek Labs, "LinkIt ONE development kit," [Online]. Available: [Link]

Wan-Jung Chang (M'16) obtained a B.S. degree in electronic engineering from Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C., in 2000, an M.S. degree in computer science and information engineering from National Taipei University of Technology, Taipei, Taiwan, R.O.C., in 2003, and a Ph.D. in electrical engineering from the National Cheng Kung
University, Tainan, Taiwan, R.O.C., in 2008. He is
expansion,” IEEE Systems Journal, vol. 10, no. 3, pp. 922-932, Sep. currently an Associate Professor in the Department
2016. of Electronic Engineering at Southern Taiwan University of Science and
[24] A. J. Ramadhan, “Wearable smart system for visually impaired people,” Technology, Tainan, Taiwan, R.O.C. He is also the director of Artificial
Sensors, vol. 18, no. 3, article 843, Mar. 2018. Intelligence in the Internet of Things Applied Research Center (AIoT Center)
[25] W. M. Elmannai and K. M. Elleithy, “A highly accurate and reliable data and the Internet of Things Laboratory (IoT Lab) at Southern Taiwan
fusion framework for guiding the visually impaired,” IEEE Access, vol. 6, University of Science and Technology, Tainan, Taiwan, R.O.C. He received
no. 1, pp. 33029-33054, 2018. the Best Paper Award at IEEE ChinaCom 2009, the Best Paper Award at
[26] D. Croce, L. Giarré, F. Pascucci, I. Tinnirello, G. E. Galioto, D. Garlisi, ICCPE 2016, the 1st Prize for the Excellent Demo! Award at IEEE GCCE
and A. L. Valvo, “An indoor and outdoor navigation system for visually 2016–2018, and the 1st Prize for the Best Paper Award at IEEE ICASI 2017.
impaired people,” IEEE Access, vol. 7, no. 1, pp. 170406-170418, 2019. His research interests include cloud/IoT/AIoT systems and applications,
[27] R. Tapu, B. Mocanu, and T. Zaharia, “DEEP-SEE: joint object detection, protocols for heterogeneous networks, and WSN/high-speed network design
tracking and recognition with application to visually impaired and analysis. He is a member of the IEEE.
navigational assistance,” Sensors, vol. 17, no. 11, article 2473, Oct. 2017.
[28] S. Lin, R. Cheng, K. Wang, and K. Yang, “Visual localizer: outdoor
localization based on ConvNet descriptor and global optimization for
visually impaired pedestrians,” Sensors, vol. 18, no. 8, article 2476, Jul. Liang-Bi Chen (S’04–M’10–SM’16) received
2018. B.S. and M.S. degrees in electronic engineering from
[29] K. Yang, K. Wang, L. M. Bergasa, E. Romera, W. Hu, D. Sun, J. Sun, R. the National Kaohsiung University of Applied
Cheng, T. Chen, and E. López, “Unifying terrain awareness for the Sciences, Kaohsiung, Taiwan, in 2001 and 2003,
visually impaired through real-time semantic segmentation,” Sensors, respectively, and a Ph.D. in electronic engineering
vol. 18, no. 5, article 1506, May 2018. from the Southern Taiwan University of Science and
[30] L.-B. Chen, J.-P. Su, M.-C. Chen, W.-J. Chang, C.-H. Yang, and C.-Y. Technology, Tainan, Taiwan, in 2019. From 2004 to
Sie, “An implementation of an intelligent assistance system for visually 2010, he was enrolled in the Computer Science and
impaired/blind people,” in Proceedings of the 2019 IEEE International Engineering Ph.D. Program with National Sun
Conference on Consumer Electronics (ICCE), 2019, pp. 1-2. Yat-Sen University, Kaohsiung. In 2008, he had an
[31] SHARP Electronic Components, “GP2Y0A710K0F: Distance measuring internship with the Department of Computer Science, National University of
sensor unit measuring distance: 100 to 550 cm analog output type,” Singapore, Singapore. He was also a Visiting Researcher with the Department
SHARP GP2Y0A710K0F Datasheet, [Online] Available: URL: of Computer Science, University of California at Irvine, Irvine, CA, USA, from
[Link] 2008 to 2009, and with the Department of Computer Science and Engineering,
0k_e.pdf Waseda University, Tokyo, Japan, in 2010. In 2012, he joined BXB Electronics
[32] Wikipedia, “Triangulation,” [Online] Available: URL: Company Ltd., Kaohsiung, as a Research and Development Engineer, where he
[Link] was an Executive Assistant to the Vice President, from 2013 to 2016. In 2016,
[33] InvenSense, “MPU-6050 six-axis (Gyro + Accelerometer) MEMS he joined the Southern Taiwan University of Science and Technology, as a
MotionTracking™ devices,” [Online] Available: URL: Research Manager and an Adjunct Assistant Professor. He is a member of the
[Link] IEICE and PMI. He received the 1st Prize for the Best Paper Award from the
/. IEEE ICASI 2017, Honorable Mention at IEEE ICAAI 2016, the 1st Prize for
1558-1748 (c) 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See [Link] for more information.
Authorized licensed use limited to: University of Wollongong. Downloaded on July 18,2020 at [Link] UTC from IEEE Xplore. Restrictions apply.
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/JSEN.2020.2990609, IEEE Sensors
Journal
IEEE SENSORS JOURNAL: Sensors-31523-2020.R1 12
the Excellent Demo! Award from IEEE GCCE 2016–2018, and the 2nd Prize for
the Excellent Poster Award from IEEE GCCE 2016. He was a recipient of the
2014 IEEE Education Society Student Leadership Award, the 2019 Young
Scholar Potential Elite Award of the National Central Library, Taiwan, and the
2018-2019 Publons Peer Review Award. He has also served as a TPC Member,
an IPC Member, and a reviewer for many IEEE/ACM international conferences
and journals. Since 2019, he has served as an Associate Editor for IEEE Access
and a Guest Editor for Energies and Applied Sciences Journals, respectively.
1558-1748 (c) 2020 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See [Link] for more information.
Authorized licensed use limited to: University of Wollongong. Downloaded on July 18,2020 at [Link] UTC from IEEE Xplore. Restrictions apply.