

Design and Implementation of an Intelligent Assistive System for Visually Impaired People for Aerial Obstacle Avoidance and Fall Detection

Wan-Jung Chang, Member, IEEE, Liang-Bi Chen, Senior Member, IEEE, Ming-Che Chen, Member, IEEE, Jian-Ping Su, Cheng-You Sie, and Ching-Hsiang Yang

Abstract—This paper proposes an intelligent assistive system based on wearable smart glasses and an intelligent walking stick for visually impaired people to achieve the goals of aerial obstacle avoidance and fall detection. The proposed assistive system comprises wearable smart glasses, an intelligent walking stick, a mobile device app, and a cloud-based information management platform. Visually impaired people can wear the proposed wearable smart glasses and hold the proposed intelligent walking stick to detect aerial obstacles and fall events on roads. Moreover, the proposed intelligent walking stick can vibrate to guide visually impaired people to avoid aerial obstacle collision accidents. Experimental results show that the proposed system can detect aerial obstacles within 3 meters, and the average accuracy of fall detection reaches up to 98.3%. Furthermore, when visually impaired people experience a fall event, an urgent notification is immediately sent to their family members or caregivers.

Index Terms—Fall detection, Internet of Things (IoT), smart glasses, obstacle avoidance, visually impaired, walking stick, wearable assistive devices.

Manuscript received February 26, 2020; revised April 13, 2020; accepted April 23, 2020. This work was supported in part by the Ministry of Science and Technology (MoST), Taiwan, R.O.C., under Grants MOST 108-2637-E-218-004 and MOST 108-2622-8-218-004-TE2, and in part by the Ministry of Education (MoE), Taiwan, R.O.C., under Grant 1300-107P735 through the Advanced Intelligent Biomedical Allied Research Center under the Higher Education Sprout Project. An earlier and briefer version of this paper was presented at the 2019 IEEE ICCE [30], Las Vegas, NV, USA, and was published (DOI: 10.1109/ICCE.2019.8661943) in its Proceedings. (Corresponding author: Wan-Jung Chang.)

W.-J. Chang is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: allenchang@[Link]).
L.-B. Chen is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: [Link]@[Link]).
M.-C. Chen is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: jerryhata@[Link]).
J.-P. Su is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: jianpingsu1111@[Link]).
C.-Y. Sie is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C.
C.-H. Yang is with the Department of Electronic Engineering, Southern Taiwan University of Science and Technology, Tainan 71005, Taiwan, R.O.C. (e-mail: 4a339036@[Link]).

I. INTRODUCTION

RECENTLY, the World Health Organization (WHO) released a global population statistical report [1] in 2017 noting that of the approximately 7.338 billion people in the world, approximately 285 million were visually impaired as of the end of 2016. Fig. 1 shows the proportion of visually impaired people in the total world population. Among these visually impaired people, 39 million were blind, and 246 million had amblyopia, together accounting for 4.25% of the total population of the world.

Fig. 1. The proportion of visually impaired people in the total world population.

Whether due to congenital inheritance, acquired diseases, accidental injuries, or other reasons, visual impairment causes a certain level of inconvenience in these people's lives. For example, it is very difficult for visually impaired people to walk independently in unfamiliar and complex areas.

Currently, visually impaired people mostly use white sticks, guide dogs, or voice guidance devices (with built-in A-GPS) for directions to improve their walking safety when they go out independently. However, in complex outdoor environments, visually impaired people still face the following two common dangerous walking situations [2]-[5].

1) Collisions with Aerial Obstacles in Front

According to Manduchi and Kurniawan [3], 15% of visually impaired people hit obstacles every month on average, and 40% of visually impaired people fall down every year because they hit obstacles.

In particular, aerial obstacles, such as awnings, tree branches, and similar objects, typically have no projection on the ground or floor [4]. Moreover, these aerial obstacles cannot be detected by a general walking stick. Two examples of aerial obstacles are shown in Fig. 2.

Fig. 2. Two examples of front aerial obstacles.

2) High Fall Risk

Legood et al. [5] pointed out that a visually impaired person's chance of falling is 1.7 times that of the average person, and the probability of falling again after standing up is 1.9 times greater. When a visually impaired person walks alone and falls, he may feel less confident and helpless, potentially resulting in serious injury.

To overcome the two issues mentioned above, we design and implement an intelligent assistive system for visually impaired people that provides functionalities for outdoor walking safety, including the detection and notification of falls and collisions with aerial obstacles.

The remainder of this paper is organized as follows. Section II reviews related work. The proposed assistive system is described in Section III. A prototype of the proposed assistive system and related experiments are provided in Section IV. Finally, Section V concludes this work and offers a discussion of future research directions.

II. RELATED WORK

Many related studies [6]-[30] on assisting and helping visually impaired people were reviewed. Elmannai and Elleithy [6] surveyed the present development status of sensor-based assistive aids/devices and provided some possible future development trends. Tapu et al. [7] surveyed the state of the art in wearable assistive device development prior to 2018. Islam et al. [8] reviewed the development of assistive walking devices. Hakobyan et al. [9] surveyed the modern development of mobile assistive technologies for visually impaired people. Furthermore, Zubov [10] proposed a thin client-based concept of a smart city assistive infrastructure for blind and visually impaired people.

To address fall detection and prevention, Chaccour et al. [11] performed a generic classification of existing fall-related systems, from fall detection to fall prevention. They also suggested some future development directions for assistive devices for visually impaired people. Kau and Chen [12] proposed a fall accident detection system based on a smartphone with 3G mobile networks. A state machine was implemented for fall detection, and the corresponding features were investigated in a sequential manner.

Generally, walking sticks have become a basic aid for visually impaired people. Several related works [13]-[19] developed intelligent walking sticks to assist visually impaired people in obstacle avoidance or walking guidance. Wang and Kuchenbecker [13] presented an assistive device called HALO, which was mounted on a traditional white stick to alert for low-hanging obstacles via haptic cues. Sharma et al. [14] designed a smart stick that assisted visually impaired people in safely moving around holes, water, obstacles, and stairs. Cardillo et al. [15] designed an electromagnetic sensor-based system that was mounted on a conventional white stick to assist with the autonomous walking of visually impaired people. Villanueva and Farcy [16] implemented an active optical-based electronic travel assistive device that used a photodiode and an LED. This device was integrated with a general white stick.

Currently, artificial intelligence (AI) techniques have been applied to walking sticks to create smart walking sticks [17], [18]. For example, Degaonkar et al. [17] presented a smart stick that used AI with ultrasonic sensors to help visually impaired people navigate. Image recognition, collision detection, and obstacle detection were provided by the presented smart stick. Pruthvi et al. [18] designed a smart blind stick that adopts a popular deep learning model, YOLO [19], to provide object detection and classification. This device provides object information by voice for visually impaired people.

In recent years, many previous works have investigated guided walking and navigation applications. For example, Bai et al. [20] designed an electronic travel aid (ETA) for which smart glasses were adopted as the major wearable device to assist with indoor travel for visually impaired people. The ETA was implemented by integrating a depth sensor with an ultrasonic sensor to solve the problem of detecting few obstacles. Moreover, they further developed a dynamic sub-destination routing plan [21] that can assist visually impaired people in bypassing obstacles in front and simultaneously guide them to their destination.

Lee et al. [22] proposed a wearable glasses-based indoor positioning system. Ultrasound and image sensors were mounted on the wearable glasses. The proposed wearable glasses can identify certain color-coded markers in real time and detect obstacles in front within a distance of 15 meters.

Aladrén et al. [23] presented a navigation assistive device that adopted a consumer RGB-D image sensor to take advantage of both wide-range and color visual information for detecting obstacle-free paths.


Ramadhan [24] proposed a wearable system composed of a microcontroller board, a solar panel, some sensors, mobile communication, and GPS modules. The proposed wearable system employed a set of sensors to trace the path and to warn of obstacles in front for visually impaired users.

Elmannai and Elleithy [25] integrated sensor fusion, computer vision, and fuzzy logic techniques to provide accurate multiobject detection for collision avoidance. Croce et al. [26] presented an indoor/outdoor navigation system in which both a camera and inertial sensors were integrated into a smartphone.

Some current works [27]-[29] have applied popular deep learning techniques (such as object detection and semantic segmentation) for navigation assistance and obstacle avoidance. However, these works do not seem to be feasible under the lightweight-microprocessor and battery-life-aware conditions of assistive devices for visually impaired users, due to the computing-heavy requirements and high power consumption of the deep learning techniques.

As a result, a complete interoperated aid suite to provide walking safety for visually impaired people is still lacking. Considering the current feasibility of an assistive device application scenario for visually impaired users, this paper proposes an intelligent assistive system based on wearable smart glasses with an intelligent walking stick for visually impaired people to achieve the goals of obstacle avoidance and fall detection.

III. THE PROPOSED ASSISTIVE SYSTEM

A. Design Concept

The proposed intelligent assistive system aims to reduce the chances of visually impaired people being injured by aerial obstacles in front during walking and by falling. In addition, when a visually impaired person falls, the system automatically sends an urgent notification to the mobile phones of relevant persons (such as family members or caregivers).

As shown in Fig. 3, the proposed assistive system is composed of a pair of wearable smart glasses, an intelligent walking stick, a mobile device app (named V-Protector), and a cloud-based information management platform. A prototype of the proposed assistive system is demonstrated in Section IV.

Fig. 3. Construction of the proposed intelligent assistive system.

The proposed wearable smart glasses are used to detect aerial obstacles in front and falls. When aerial obstacles in front are detected via the proposed wearable smart glasses, the proposed intelligent walking stick vibrates to notify and guide visually impaired people to avoid those obstacles.

When a visually impaired person experiences a fall event, the location information of the fall occurrence is immediately transmitted to the proposed cloud-based information management platform. Moreover, the proposed cloud-based management platform can provide membership (user) account management, record the walking routes of visually impaired users, and forward fall event messages to the proposed V-Protector app. The proposed V-Protector app can be used to view information (such as the location and time) of visually impaired users.

A possible application scenario of the proposed intelligent assistive system is demonstrated in Fig. 4. First, a visually impaired user wears the proposed wearable smart glasses and holds the proposed intelligent walking stick on the road; then, aerial obstacles in front of the user can be detected by the proposed wearable smart glasses, and the visually impaired user can be notified through the intelligent walking stick.

Fig. 4. A possible application scenario of the proposed intelligent assistive system.

Furthermore, if the visually impaired user experiences a fall or collision event, the related information (location, time, fall event, etc.) is recorded and uploaded to the proposed cloud-based information management platform. The related information is also immediately sent to relevant persons (for example, family members or caregivers) by the proposed mobile device app.

B. Front Aerial Obstacle Detection

We focus on front aerial obstacle detection in this paper for the following reasons. First, aerial obstacles are often hit without warning, and without protection, a hit to the head can cause serious injury or even death. Second, in general, for ground obstacle detection, visually impaired people use their walking stick while walking to determine whether there are obstacles on the ground, and they usually receive walking orientation training. The design of front aerial obstacle detection is introduced below.

1) Infrared (IR) Transceiver Sensor Module Adoption

There are many sensors for measuring distance available on the market, and the most popular of these sensors are ultrasonic and IR transceiver sensors.


We choose an IR transceiver sensor for aerial obstacle detection in this paper because such sensors are inexpensive, and visually impaired persons tend to be financially limited.

As a result, we detect aerial obstacles by adopting an IR transceiver sensor module (SHARP GP2Y0A710K0F) [31] that can measure distances between 1 and 5.5 meters. As shown in Fig. 5, the output voltage of the adopted IR transceiver sensor module corresponds to the detection distance.

Fig. 5. The relationship between the output voltage and the measured distance [31].

Therefore, we can use the change in voltage to determine the distance. The detection distance we set is 0.5 to 3 meters; an object (obstacle) lies within this range when the corresponding voltage is more than 1.5 V.
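As an illustration of this thresholding rule, the following minimal sketch converts a sampled output voltage into a distance estimate and an obstacle decision. It is not the authors' firmware: the calibration pairs are taken from the average-voltage column of Table I (Section IV), and the linear interpolation between them is our simplifying assumption.

```python
# Hypothetical sketch of the voltage-threshold rule in Section III-B.1.
# Calibration pairs (distance in cm, average output voltage in V) from Table I.
CALIBRATION = [(50, 3.09), (100, 2.48), (150, 2.24), (200, 1.68),
               (250, 1.57), (300, 1.54), (350, 1.49), (400, 1.47),
               (450, 1.43), (500, 1.35)]

OBSTACLE_THRESHOLD_V = 1.5  # voltage above 1.5 V <=> obstacle within ~3 m

def estimate_distance_cm(voltage: float) -> float:
    """Linearly interpolate distance from the Table I calibration curve."""
    points = sorted(CALIBRATION, key=lambda p: p[1])  # ascending voltage
    for (d_far, v_lo), (d_near, v_hi) in zip(points, points[1:]):
        if v_lo <= voltage <= v_hi:
            frac = (voltage - v_lo) / (v_hi - v_lo)
            return d_far + frac * (d_near - d_far)
    # Out-of-range voltages clamp to the nearest calibrated distance.
    return points[0][0] if voltage < points[0][1] else points[-1][0]

def obstacle_detected(voltage: float) -> bool:
    """An aerial obstacle is reported when the output voltage exceeds 1.5 V."""
    return voltage > OBSTACLE_THRESHOLD_V

print(estimate_distance_cm(2.0), obstacle_detected(2.0))  # ~171 cm, True
```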

The size of the IR transceiver sensor module is 22 mm × 8 mm × 7.2 mm (length × width × height), which is very suitable for our proposed wearable smart glasses. This IR transceiver sensor module does not impose an excessive weight load, and its power consumption is also low. As shown in Fig. 6, the relationship between the output voltage and the inverse of the measured distance is nearly linear within the range that the IR transceiver sensor module can measure.

Fig. 6. The relationship between the output voltage and the inverse of distance [31].

2) Triangulation Method for Distance Detection of Aerial Obstacles in Front

As shown in Fig. 7, we use the triangulation method [32] for distance detection of aerial obstacles in front. The main reason is that the detection distance of aerial obstacles in front is not easily affected by the reflectivity of various objects, the ambient temperature, or the operation time. The triangulation method is based on the assumption that reference points with known coordinates can form a triangle. The two known coordinates are those of the IR emitting diode (TX) and the IR sensor diode (RX) on the IR transceiver sensor module, and the measurement target point is an obstacle. As a result, we can calculate the length of the reference side of the triangle and measure the angles between the two reference points and the target point to find the distance and coordinates of the target point.

Fig. 7. Distance detection of aerial obstacles in front by using the triangulation method.
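A small numerical sketch of the triangulation idea follows, under illustrative assumptions: the baseline length and ray angles below are invented placeholder values, not the module's actual emitter/receiver geometry.

```python
import math

def triangulate_distance(baseline_m: float, angle_tx_rad: float,
                         angle_rx_rad: float) -> float:
    """Distance from the TX/RX baseline to the target by triangulation.

    baseline_m: known TX-RX separation (the reference side of the triangle).
    angle_tx_rad / angle_rx_rad: angles between the baseline and the rays
    from TX and RX toward the target.
    """
    # Law of sines: the angle at the target closes the triangle.
    angle_target = math.pi - angle_tx_rad - angle_rx_rad
    range_from_tx = baseline_m * math.sin(angle_rx_rad) / math.sin(angle_target)
    # Perpendicular distance from the baseline to the target.
    return range_from_tx * math.sin(angle_tx_rad)

# Illustrative values only: a 22 mm baseline and near-parallel rays.
print(triangulate_distance(0.022, math.radians(89.6), math.radians(89.9)))  # ~2.5 m
```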
C. Fall Recognition by the Proposed Wearable Smart Glasses

First, a motion tracking sensor module (an MPU6050 is adopted [33]) is used as a component to detect the posture of the visually impaired user. This motion tracking sensor module (6-axis sensor module) contains a 3-axis acceleration sensor and a 3-axis gyroscope sensor, so these two sensing components can be used to judge five postures (states) of the human body, including standing, sitting, lying, running, and walking, as shown in Fig. 8.

Fig. 8. The motion tracking sensor module (6-axis sensor module) is used to judge five postures (states) of the human body, including standing, sitting, lying, running, and walking.


Fall recognition by the proposed wearable smart glasses involves five steps (feature acquisition and analysis, upright and tilt action classification, action model recognition, posture conversion recognition, and fall behavior recognition), which are introduced as follows.

1) Feature Acquisition and Analysis

The accelerometer and gyro signals of the motion tracking sensor module provide the vertical angle (θAcc) between the upper human body (smart glasses) and the ground (g) during movement, as shown in the following equation (1) and Fig. 9. Please note that Ax and Az represent the accelerations in the x-direction and z-direction, respectively.

θAcc = arctan(Ax / Az)    (1)

Fig. 9. The accelerometer and gyro signals of the motion tracking sensor module provide the vertical angle (θAcc) between the upper human body (smart glasses) and the ground (g) during movement.
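As a small worked example of equation (1), the sketch below computes the tilt angle from raw accelerometer readings; the sample values are invented.

```python
import math

def tilt_angle_deg(ax: float, az: float) -> float:
    """Equation (1): vertical tilt angle from the x- and z-axis accelerations.

    atan2 is used instead of a bare arctan so the sign of the tilt
    is preserved when the wearer leans backward.
    """
    return math.degrees(math.atan2(ax, az))

# Made-up readings in units of g: nearly upright, slight forward lean.
print(tilt_angle_deg(0.10, 0.99))  # ~5.8 degrees
```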

Hence, the vertical angle (θAcc) is combined with the angle θGyro measured by the gyro sensor of the motion tracking sensor module, and the multisensor data fusion Kalman filtering model [35] is applied to obtain the corrected upper-human-body tilt angle θ, as shown in equations (2) and (3). The Kalman filtering model is used to filter the noise from the accelerometer. Please note that uk denotes the gyro input data and bias denotes the bias of the gyro measurement.

θGyro,k = θk−1 + (uk − biask−1) · dt    (2)

[θk, biask]ᵀ = [1, −dt; 0, 1] · [θk−1, biask−1]ᵀ + [dt, 0]ᵀ · uk    (3)
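For concreteness, here is a hedged sketch of the gyro-bias Kalman filter that equations (2) and (3) describe; it follows the standard one-dimensional tilt-filter formulation, and the noise variances are made-up tuning values rather than the authors' calibrated parameters.

```python
class TiltKalmanFilter:
    """Fuses the gyro rate (u_k) with the accelerometer angle (theta_Acc).

    State: [theta, bias]. The prediction step follows equations (2)-(3);
    the accelerometer angle serves as the noisy measurement of theta.
    """

    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.theta = 0.0          # corrected tilt angle (degrees)
        self.bias = 0.0           # estimated gyro bias (degrees/s)
        self.P = [[0.0, 0.0], [0.0, 0.0]]  # state covariance
        self.q_angle, self.q_bias, self.r = q_angle, q_bias, r_measure

    def update(self, u_k: float, theta_acc: float, dt: float) -> float:
        # Predict (equations (2)-(3)): theta += (u_k - bias) * dt.
        self.theta += (u_k - self.bias) * dt
        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + self.q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += self.q_bias * dt
        # Correct with the accelerometer angle from equation (1).
        innovation = theta_acc - self.theta
        s = P[0][0] + self.r
        k0, k1 = P[0][0] / s, P[1][0] / s  # Kalman gains
        self.theta += k0 * innovation
        self.bias += k1 * innovation
        p00, p01 = P[0][0], P[0][1]
        P[0][0] -= k0 * p00
        P[0][1] -= k0 * p01
        P[1][0] -= k1 * p00
        P[1][1] -= k1 * p01
        return self.theta
```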
2) Upright and Tilt Action Classification

When we obtain the value of the corrected upper-human-body tilt angle θ, a threshold angle can be used for classification. The threshold angle is set to 30 degrees. When the measured angle is less than 30 degrees, the posture is classified as an upright action (walking, running, and standing). When the angle (θ) is greater than 30 degrees (θTh) or less than −30 degrees (−θTh), it is classified as a tilt action (sitting, lying down, and falling), as shown in Fig. 10.

Fig. 10. Upright and tilt action classifications.

3) Action Model Recognition

The fourth-order Runge-Kutta [36] integral equation (4) is used to calculate the vertical velocity x(tk+1) of the upper human body over the past k samples, and the two parameters of the tilt angle θ and x(tk+1) of the human body can be used to recognize six human postures (standing, sitting, lying, running, walking, and falling).

x(tk+1) ≈ x(tk) + (Δx1 + 2Δx2 + 2Δx3 + Δx4)/6    (4)
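The sketch below illustrates the fourth-order Runge-Kutta update of equation (4) for the vertical velocity; the helper accel_at() is a hypothetical interpolator over the gravity-compensated vertical acceleration stream, and the example input is invented.

```python
def rk4_step(x_k: float, t_k: float, dt: float, accel_at) -> float:
    """Equation (4): advance the vertical velocity x by one RK4 step.

    accel_at(t) returns the (gravity-compensated) vertical acceleration
    at time t; for dx/dt = a(t) the four increments reduce to samples
    of a(t) at the endpoints and midpoint of the interval.
    """
    dx1 = dt * accel_at(t_k)
    dx2 = dt * accel_at(t_k + dt / 2)
    dx3 = dt * accel_at(t_k + dt / 2)
    dx4 = dt * accel_at(t_k + dt)
    return x_k + (dx1 + 2 * dx2 + 2 * dx3 + dx4) / 6

# Invented example: constant 1 m/s^2 downward acceleration for 20 ms.
print(rk4_step(0.0, 0.0, 0.02, lambda t: -1.0))  # -0.02 m/s
```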
4) Posture Conversion Recognition

Then, we calculate the maximum difference (XMaxDiff) between the maximum and minimum vertical acceleration measured during each sampling period i (XMaxDiff = Max{ai} − Min{ai}, ∀i = 1, 2, 3, …, n). In the ith sampling period, if XMaxDiff > Dth (a preset threshold value), then a posture conversion is determined, and an action is performed for recognition of motion patterns in the ith sampling period.

Through this recognition, it is known that if the acceleration of the motion sensor module configured in the wearable smart glasses exceeds the preset threshold value Dth (i.e., XMaxDiff > Dth) during the ith sampling period, the wearable smart glasses are currently in a state of rapid change in tilt.

Next, we supplement the posture conversion recognition to determine the fall posture, which can improve the accuracy of fall detection and reduce the probability of misjudgment. In other words, we integrate the center of the human torso with the posture (state) changes for standing, sitting, lying, running, and walking to determine the fall posture (state), as shown in Fig. 11.
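A minimal sketch of this window test follows; the threshold D_TH and the sample window are illustrative values, not the firmware's calibrated constants.

```python
D_TH = 0.8  # illustrative threshold (g); the real Dth is calibrated

def posture_conversion(window: list[float]) -> bool:
    """True when XMaxDiff = Max{a_i} - Min{a_i} exceeds Dth, i.e., the
    glasses undergo a rapid change in tilt within the sampling period."""
    x_max_diff = max(window) - min(window)
    return x_max_diff > D_TH

# Invented vertical-acceleration window: a sudden drop during a fall.
print(posture_conversion([1.0, 0.95, 0.4, 0.1, 1.6]))  # True
```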


Fig. 11. Posture (state) changes for standing, sitting, lying, running, and walking used to determine the fall posture (state).

5) Fall Behavior Recognition

The corrected (adjusted) human-body tilt angle θ, XMaxDiff, the change in the kth sampled human vertical velocity x(tk+1), and the posture conversion recognition are used to develop a high-accuracy fall behavior recognition algorithm. Finally, this fall behavior recognition algorithm is implemented in firmware in the microcontroller unit (MCU) of the proposed wearable smart glasses.
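To make the five-step flow concrete, here is a simplified end-to-end sketch. The particular combination rule and thresholds are our own illustrative placeholders; the actual firmware logic is as described above.

```python
TILT_TH_DEG = 30.0   # upright/tilt boundary from Section III-C.2
VELOCITY_TH = -1.0   # illustrative downward-velocity threshold (m/s)

def glasses_fall_recognized(theta_deg: float, x_max_diff: float,
                            velocity: float, d_th: float) -> bool:
    """Simplified fall rule: a rapid posture conversion (XMaxDiff > Dth)
    that ends in a tilt posture with a large downward vertical velocity."""
    rapid_conversion = x_max_diff > d_th
    tilted = abs(theta_deg) > TILT_TH_DEG
    falling_fast = velocity < VELOCITY_TH
    return rapid_conversion and tilted and falling_fast
```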
D. Fall Recognition by the Proposed Intelligent Walking Stick

Fall recognition by the proposed intelligent walking stick is similar to that by the proposed wearable smart glasses. The proposed intelligent walking stick uses the same motion tracking sensor module.

The proposed wearable smart glasses and intelligent walking stick simultaneously detect the posture of the visually impaired user and recognize whether it matches a fall event to greatly enhance the accuracy of fall event recognition.

Generally, visually impaired people differ from sighted people in that they always carry a walking stick as an assistive device when walking. Hence, when a visually impaired person falls, the position of the walking stick changes (for example, the walking stick is tilted quickly and then lies flat on the ground).

Fall recognition by the proposed intelligent walking stick involves three steps (posture conversion recognition, feature acquisition and analysis, and fall behavior recognition), which are introduced as follows.

1) Posture Conversion Recognition

We calculate the maximum difference (XMaxDiff) between the maximum and minimum vertical acceleration of the motion sensor module measured during each fixed sampling period i (XMaxDiff = Max{ai} − Min{ai}, ∀i = 1, 2, 3, …, n) to determine the change in the current position of the walking stick. In the ith sampling period, if XMaxDiff > Dth (a preset threshold value), then a posture conversion is determined, and an action is performed for recognition of motion patterns in the ith sampling period. Through this recognition, it is known that if the acceleration of the motion sensor module configured in the walking stick exceeds the preset threshold value Dth (i.e., XMaxDiff > Dth) during the ith sampling period, the walking stick is currently in a state of rapid change in tilt.

2) Feature Acquisition and Analysis

Similar to that of the wearable smart glasses, the accelerometer signal of the motion tracking sensor module provides the vertical angle (θAcc) between the proposed intelligent walking stick and the ground (g) during movement, as shown in equation (1) and Fig. 12.

Fig. 12. The accelerometer signal of the motion tracking sensor module provides the vertical angle (θAcc) between the proposed intelligent walking stick and the ground (g) during movement.

The Y-axis data of the accelerometer in the motion sensor module are continuously sampled to determine the current position state of the proposed intelligent walking stick from the angle judgment. If the angle measured by the motion sensor module approaches "0", then the current walking stick position is lying flat on the ground.

3) Fall Behavior Recognition

When the two conditions of steps 1 and 2 are both established, the visually impaired user holding the walking stick may have fallen; that is, the walking stick lies on the ground after being rapidly inclined when a fall occurs. Finally, this fall behavior recognition algorithm is implemented in firmware in the microprocessor (MPU) of the proposed intelligent walking stick.
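A sketch of the corresponding stick-side rule follows; LYING_FLAT_TOL is an invented tolerance for the "angle approaches 0" test, not a documented constant.

```python
LYING_FLAT_TOL = 10.0  # invented tolerance (degrees) around "0"

def stick_fall_recognized(x_max_diff: float, d_th: float,
                          stick_angle_deg: float) -> bool:
    """Steps 1 and 2 for the walking stick: a rapid tilt change
    (XMaxDiff > Dth) followed by the stick lying flat on the ground
    (measured angle approaching 0)."""
    rapid_tilt_change = x_max_diff > d_th
    lying_flat = abs(stick_angle_deg) < LYING_FLAT_TOL
    return rapid_tilt_change and lying_flat
```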
E. Fusion of the Two Fall Recognition Methods

To avoid misjudgment by the proposed intelligent assistive system, we fuse the two abovementioned fall recognition methods. When the proposed intelligent walking stick recognizes a fall event and a fall signal is also received from the proposed wearable smart glasses, the event is determined to be a real fall, and the current GPS information is immediately transmitted to the proposed cloud-based information management platform by a low-power wide-area network (LPWAN) [37], [38].


TABLE I
DISTANCE CONVERSION VOLTAGE MEASUREMENTS BY THE ADOPTED IR TRANSCEIVER SENSOR MODULE
(Min./Max. = minimum/maximum output voltage)

Distance (cm) | Min. (V) @47% RH | Max. (V) @47% RH | Min. (V) @71% RH | Max. (V) @71% RH | Min. (V) @95% RH | Max. (V) @95% RH | Average (V)
50  | 3.09 | 3.14 | 3.06 | 3.11 | 3.05 | 3.09 | 3.09
100 | 2.48 | 2.54 | 2.46 | 2.53 | 2.43 | 2.46 | 2.48
150 | 2.23 | 2.26 | 2.23 | 2.25 | 2.21 | 2.24 | 2.24
200 | 1.68 | 1.70 | 1.68 | 1.69 | 1.67 | 1.68 | 1.68
250 | 1.56 | 1.63 | 1.54 | 1.60 | 1.53 | 1.57 | 1.57
300 | 1.51 | 1.57 | 1.53 | 1.58 | 1.50 | 1.55 | 1.54
350 | 1.48 | 1.53 | 1.46 | 1.52 | 1.47 | 1.50 | 1.49
400 | 1.47 | 1.52 | 1.45 | 1.51 | 1.43 | 1.46 | 1.47
450 | 1.45 | 1.48 | 1.40 | 1.43 | 1.38 | 1.41 | 1.43
500 | 1.38 | 1.43 | 1.36 | 1.39 | 1.21 | 1.31 | 1.35

Fig. 13. The flowchart of the proposed intelligent assistive system: after power-on and BLE pairing, the smart glasses detect aerial obstacles within 3 meters via IR and signal the walking stick over BLE to vibrate until the obstacle is avoided, while a fall state detected simultaneously by the 6-axis sensors on both devices triggers an upload of the fall event and its GPS location to the cloud-based information management platform via LoRa-based LPWAN, which forwards the fall event message to the proposed mobile device app via mobile networks.

Hence, the location and time of occurrence are uploaded to the proposed cloud-based information management platform for storage, and a message is forwarded to the V-Protector app to alert the family members or caregivers of the visually impaired user. However, if the proposed intelligent walking stick does not recognize a fall event but receives a fall signal from the proposed wearable smart glasses, this fall event will be ignored due to misjudgment by the smart glasses. Through the fusion of the two fall recognition methods of the proposed wearable smart glasses and the proposed intelligent walking stick, the fall event recognition rate can be improved; moreover, false alarms can be avoided.
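The fusion rule of Section III-E reduces to a logical AND of the two recognizers, as in the sketch below; report_via_lora is a hypothetical stand-in for the LoRa uplink.

```python
def fused_fall_decision(glasses_fall: bool, stick_fall: bool,
                        report_via_lora) -> bool:
    """A fall is confirmed only when BOTH devices agree (Section III-E);
    a glasses-only signal is discarded as a misjudgment."""
    if stick_fall and glasses_fall:
        report_via_lora()  # upload GPS location + timestamp over LPWAN
        return True
    return False

# Example: the glasses alone raise a fall -> ignored as a false alarm.
print(fused_fall_decision(True, False, lambda: None))  # False
```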


F. LoRa-Based Low-Power Wide-Area Network (LPWAN)

Visually impaired people mostly encounter dangerous events outdoors; they usually do not have the habit of carrying a smartphone, and they always hold a walking stick to assist them while walking on outdoor roads. Hence, it is not suitable for a visually impaired user to need a smartphone to send fall event messages or to seek assistance. For these reasons, we implement the fall event announcement function on the proposed intelligent walking stick via a LoRa-based LPWAN [37]-[39].

Finally, the flowchart of the proposed intelligent assistive system is shown in Fig. 13.
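Putting the pieces together, the sketch below mirrors the walking-stick side of the Fig. 13 flow; every hardware helper passed in (ble_receive_obstacle_flag, vibrate, lora_upload, and so on) is hypothetical.

```python
import time

def walking_stick_main_loop(ble_receive_obstacle_flag, vibrate, stop_vibrate,
                            read_stick_fall, read_glasses_fall, lora_upload,
                            read_gps):
    """Hypothetical event loop mirroring Fig. 13 (walking-stick side)."""
    while True:
        # Aerial obstacle within 3 m reported by the glasses over BLE.
        if ble_receive_obstacle_flag():
            vibrate()          # guide the user around the obstacle
        else:
            stop_vibrate()     # obstacle avoided; the IR keeps scanning
        # Fused fall recognition (Section III-E): both devices must agree.
        if read_stick_fall() and read_glasses_fall():
            lora_upload(event="fall", location=read_gps())
        time.sleep(0.05)       # illustrative 20 Hz polling period
```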
IV. PROTOTYPE DEMONSTRATION AND EXPERIMENTS

Figs. 14 and 15 demonstrate photographs of a prototype implementation of the proposed wearable smart glasses and the proposed intelligent walking stick, respectively.

Fig. 14 shows a prototype of the proposed wearable smart glasses. As shown in Fig. 14, an IR transceiver sensor module (see Fig. 14 (a)), a 6-axis (gyro + accelerometer) microelectromechanical system (MEMS) motion tracking sensor module (see Fig. 14 (d)), a microcontroller unit (MCU) (see Fig. 14 (c)), a Bluetooth low energy (BLE) wireless communication module (see Fig. 14 (d)), and a battery charging module (see Fig. 14 (b)) are mounted on the wearable smart glasses. The proposed wearable smart glasses and the proposed intelligent walking stick are connected and paired by BLE wireless communication to achieve the fusion of the two fall recognition methods.

Fig. 14. Photographs of the prototype of the proposed wearable smart glasses.

The proposed wearable smart glasses use an 8051-like MCU (MPC82G516AD [40]) as the processing core to receive the analog voltage of the IR transceiver sensor module. After A/D conversion and signal processing, the transmitter (TX) of the BLE module sends the signal to the receiver (RX) of the BLE module of the proposed intelligent walking stick. In addition to the standard functions of the 8051 MCU, many system-level functions have been integrated into this MCU. These functions can effectively reduce the circuit board area and cost. In other words, we need a miniature circuit board for embedding circuit modules into the proposed wearable smart glasses. To achieve this purpose, we design and implement the PCB, including the MCU, a BLE module, a 6-axis sensor module, and a battery charging module, as seen in Fig. 14 (b) to (d).

The proposed intelligent walking stick does not have space limitations. Hence, some existing development boards (such as Raspberry Pi [41] and LinkIt™ ONE [42]) can be selected and used as an MPU for the proposed walking stick. Thus, we adopt the LinkIt™ ONE development board as the MPU for the walking stick.

The LinkIt™ ONE development board integrates a BLE module, a LoRa-based LPWAN communication module, a GPS module, a 6-axis (gyro + accelerometer) MEMS motion tracking sensor, and a vibration motor module, where the BLE module is responsible for pairing and connecting with the proposed wearable smart glasses.

When the intelligent walking stick receives an aerial obstacle detection signal from the proposed wearable smart glasses, the vibration motor is actuated to warn the visually impaired person via vibration. Moreover, the LoRa-based LPWAN module is responsible for uploading the fall information with the GPS location to the proposed cloud-based information management platform, and the 6-axis (gyro + accelerometer) MEMS motion tracking sensor mounted on the proposed intelligent walking stick is responsible for co-recognition with the proposed wearable smart glasses. Hence, the current posture of the visually impaired person can be recognized to determine whether the person has truly fallen.

Fig. 15 shows a photograph of a prototype of the proposed intelligent walking stick. As shown in Fig. 15, a vibration motor, a GPS module, an MPU, a BLE wireless communication module, a LoRa-based LPWAN wireless communication module, and a 6-axis (gyro + accelerometer) MEMS motion tracking sensor module are mounted on the walking stick.

Fig. 15. Photograph of the prototype of the proposed intelligent walking stick.


TABLE II
EXPERIMENTS ON POSTURE CONVERSION RECOGNITION FOR THREE SUBJECTS

                | Standing | Sitting | Lying   | Running | Walking | Falling | Summary
Success (count) | 138      | 137     | 150     | 145     | 141     | 140     | 851 times (total)
Total (count)   | 150      | 150     | 150     | 150     | 150     | 150     | 900 times (total)
Accuracy        | 92.00%   | 91.33%  | 100.00% | 96.67%  | 94.00%  | 93.33%  | 94.56% (total)

TABLE III
EXPERIMENTS ON FALL DETECTION FOR THREE SUBJECTS

                                                                              | Subject 1 | Subject 2 | Subject 3 | Summary
Action Tests (count)                                                          | 300       | 300       | 300       | 900 times (total)
Falling Actions (count)                                                       | 60        | 60        | 60        | 180 times (total)
Fall Detection by the Proposed Smart Glasses (count)                          | 68        | 70        | 66        | Misjudged 18 times (total)
Fall Detection by the Proposed Intelligent Walking Stick (count)              | 61        | 65        | 61        | Misjudged 7 times (total)
Fall Detection Fusion (Smart Glasses and Stick) (count)                       | 60        | 62        | 61        | Misjudged 3 times (total)
Accuracy Rate of Fall Detection by the Proposed Smart Glasses (%)             | 86.6%     | 83.3%     | 90.0%     | 90.0% (average)
Accuracy Rate of Fall Detection by the Proposed Intelligent Walking Stick (%) | 98.3%     | 91.7%     | 98.3%     | 96.1% (average)
Accuracy Rate of Fall Detection Fusion (Smart Glasses and Walking Stick) (%)  | 100%      | 96.6%     | 98.3%     | 98.3% (average)

For the distance detection of aerial obstacles in front, the IR transceiver sensor module outputs the actual measured distance as a voltage for the MCU/MPU to determine the distance. Therefore, for the experiments, we actually wore the smart glasses and measured 10 positions in sequence in 0.5-meter steps, i.e., from 50 cm to 500 cm with each 50 cm as a unit. We measured in an outdoor environment from morning to night for a whole day, taking 30 measurements every two hours. The relative humidity (RH) was between 47% and 95%, and the temperature was between 16 °C and 24 °C.

We measured and recorded the distance conversion voltage from the adopted IR transceiver sensor module, as shown in Table I. The distances detected during the tests were not easily affected by the shape or the reflectance of the object but were slightly affected by the RH.

Based on the walking safety and response time of visually impaired users, we set the effective detection distance to 3 meters (the threshold voltage is set to 1.5 V), which means that the visually impaired user is warned by the vibration motor when there is an aerial obstacle in front within 3 meters. As a result, visually impaired users can avoid aerial obstacles in front within 3 meters.

We performed tests for posture conversion recognition with three subjects wearing the proposed wearable smart glasses. Each subject was tested 50 times for each posture, including standing, sitting, lying, running, walking, and falling. Table II shows the experimental results of posture conversion recognition obtained for the three subjects. As shown in Table II, the six postures were recorded nine hundred times in total for the three subjects, with 851 successes. The accuracy rates of these postures (standing, sitting, lying, running, walking, and falling) are 92.00%, 91.33%, 100.00%, 96.67%, 94.00%, and 93.33%, respectively. In summary, the total accuracy rate is 94.56%.

Next, we performed tests for the proposed fall detection recognition. Three subjects wore the proposed wearable smart glasses and held the proposed intelligent walking stick. Each subject performed 300 common actions, including walking, standing, sitting, and falling; falling actions (events) were conducted 60 times per subject. Each test takes one minute as a continuous action pattern test (e.g., standing → walking → sitting → standing → walking → falling).

Table III shows the results of the fall detection experiments for the three subjects. As shown in Table III, the accuracy rate of fall detection by the proposed smart glasses varies between 83.3% and 90.0%, the accuracy rate of fall detection by the proposed intelligent walking stick varies between 91.7% and 98.3%, and the accuracy rate of fall detection by the fusion of the smart glasses and stick varies between 96.6% and 100%. Finally, the average accuracy of fall detection can reach up to 98.3%. As a result, the fall event recognition rate can be improved by the fusion of the two assistive devices (smart glasses and walking stick), and false alarms of fall events can also be avoided.

Fig. 16 shows an execution screen of the proposed cloud-based information management platform, which we implemented with PHP and MySQL. When a visually impaired user experiences a fall event, the GPS location and related information are sent to the proposed cloud-based information management platform via LoRa-based LPWAN wireless communication. Then, a fall alert notification is sent to the V-Protector app to report that the visually impaired user may have fallen or had an accident, so that emergency treatment can be performed as soon as possible, achieving the function of back-end platform services.


Fig. 17 demonstrates execution screens of the proposed mobile device application (V-Protector). Fig. 17 (b) shows the screen reporting notification messages of an accident occurrence for the visually impaired user, and Fig. 17 (c) shows the screen reporting the location of the accident occurrence.

In other words, the visually impaired user is mapped to their family members' or caregivers' smartphones through the "Token" value in the event of an accident, and an emergency message can be sent to the V-Protector app via the Firebase push function to alert the family members or caregivers. In addition, the intelligent walking stick uploads the GPS latitude and longitude to the cloud-based information management platform every 5 minutes. As a result, the family members or caregivers can view the location of the visually impaired user at any time in the V-Protector app.

Fig. 16. Execution screen of the proposed cloud-based information management platform.

Fig. 17. Execution screens of the proposed mobile device application.

V. CONCLUSIONS AND FUTURE WORK

In this paper, an intelligent assistive system has been proposed for visually impaired people for aerial obstacle avoidance and fall detection. The proposed intelligent assistive system comprises wearable smart glasses, an intelligent walking stick, a cloud-based information platform, and a mobile device app. Through the fusion of the two fall recognition methods by the proposed wearable smart glasses and the proposed intelligent walking stick, the fall event recognition rate can be improved; moreover, false alarms can be avoided.

As a result, the experimental results show that the proposed system can efficiently detect aerial obstacles within a distance of 3 meters, and the average accuracy of fall detection reaches up to 98.3% by using the fusion of the two fall recognition methods.

Furthermore, when visually impaired people experience a fall event, an urgent notification will immediately be sent to their family members or caregivers via the proposed mobile device app. When aerial obstacles in front are detected, the proposed intelligent stick will vibrate to guide visually impaired people to avoid collision accidents. In future work, we will integrate deep learning techniques for recognizing front aerial and ground images (such as traffic signs and traffic cones) for obstacle avoidance and develop intelligent guided walking-related functions.

ACKNOWLEDGMENTS

The authors would like to thank the Suang Lien Foundation for the Visually Impaired, Taipei, Taiwan, R.O.C., and the Tainan Blind Welfare Association, Tainan, Taiwan, R.O.C., for providing many valuable comments and suggestions for this work.

REFERENCES

[1] World Health Organization (WHO), "Visual impairment and blindness," Oct. 11, 2017. [Online]. Available: [Link]
[2] W.-J. Chang, J.-P. Su, L.-B. Chen, M.-C. Chen, C.-H. Hsu, C.-H. Yang, C.-Y. Sie, and C.-H. Chuang, "An AI edge computing based wearable assistive device for visually impaired people zebra-crossing walking," in Proceedings of the 2020 IEEE International Conference on Consumer Electronics (ICCE), Jan. 2020, pp. 1-2.
[3] R. Manduchi and S. Kurniawan, "Watch your head, mind your step: mobility-related accidents experienced by people with visual impairment," Technical Report, Univ. California, Santa Cruz, 2010.
[4] J. M. Sáez, F. Escolano, and M. A. Lozano, "Aerial obstacle detection with 3-D mobile devices," IEEE Journal of Biomedical and Health Informatics, vol. 19, no. 1, pp. 74-80, Jan. 2015.
[5] R. Legood, P. Scuffham, and C. Cryer, "Are we blind to injuries in the visually impaired? A review of the literature," Injury Prevention, vol. 8, pp. 155-160, 2002.
[6] W. Elmannai and K. Elleithy, "Sensor-based assistive devices for visually-impaired people: current status, challenges, and future directions," Sensors, vol. 17, no. 3, article 565, pp. 1-42, Mar. 2017.
[7] R. Tapu, B. Mocanu, and T. Zaharia, "Wearable assistive devices for visually impaired: a state of the art survey," Pattern Recognition Letters, Oct. 2018, [Link].
[8] M. M. Islam, M. S. Sadi, K. Z. Zamli, and M. M. Ahmed, "Developing walking assistants for visually impaired people: a review," IEEE Sensors Journal, vol. 19, no. 8, pp. 2814-2828, Apr. 2019.
[9] L. Hakobyan, J. Lumsden, D. O'Sullivan, and H. Bartlett, "Mobile assistive technologies for the visually impaired," Survey of Ophthalmology, vol. 58, no. 6, pp. 513-528, Nov.-Dec. 2013.
[10] D. Zubov, "A smart city assistive infrastructure for the blind and visually impaired people: a thin client concept," Broad Research in Artificial Intelligence and Neuroscience (BRAIN), vol. 9, no. 4, pp. 25-37, Nov. 2018.
[11] K. Chaccour, R. Darazi, A. H. E. Hassani, and E. Andrès, "From fall detection to fall prevention: a generic classification of fall-related systems," IEEE Sensors Journal, vol. 17, no. 3, pp. 812-822, Feb. 2017.
[12] L.-J. Kau and C.-S. Chen, "A smart phone-based pocket fall accident detection, positioning, and rescue system," IEEE Journal of Biomedical and Health Informatics, vol. 19, no. 1, pp. 44-56, Jan. 2015.


[13] Y. Wang and K. J. Kuchenbecker, "HALO: haptic alerts for low-hanging obstacles in white cane navigation," in Proceedings of the IEEE Haptics Symposium 2012, 2012, pp. 527-532.
[14] H. Sharma, M. Tripathi, A. Kumar, and M. S. Gaur, "Embedded assistive stick for visually impaired persons," in Proceedings of the 2018 9th International Conference on Computing, Communication and Networking Technologies (ICCCNT), 2018, pp. 1-6.
[15] E. Cardillo, V. D. Mattia, G. Manfredi, P. Russo, A. D. Leo, A. Caddemi, and G. Cerri, "An electromagnetic sensor prototype to assist visually impaired and blind people in autonomous walking," IEEE Sensors Journal, vol. 18, no. 6, pp. 2568-2576, Mar. 2018.
[16] J. Villanueva and R. Farcy, "Optical device indicating a safe free path to blind people," IEEE Transactions on Instrumentation and Measurement, vol. 61, no. 1, pp. 170-177, Jan. 2012.
[17] S. Degaonkar, M. Gupta, P. Hiwarkar, M. Singh, and S. A. Itkar, "A smart walking stick powered by artificial intelligence for the visually impaired," International Journal of Computer Applications, vol. 178, no. 32, pp. 7-10, Jul. 2019.
[18] S. Pruthvi, P. S. Nihal, R. R. Menon, S. S. Kumar, and S. Tiwari, "Smart blind stick using artificial intelligence," International Journal of Engineering and Advanced Technology (IJEAT), vol. 8, no. 5S, pp. 19-22, May 2019.
[19] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: unified, real-time object detection," in Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 779-788.
[20] J. Bai, S. Lian, Z. Liu, K. Wang, and D. Liu, "Smart guiding glasses for visually impaired people in indoor environment," IEEE Transactions on Consumer Electronics, vol. 63, no. 3, pp. 258-266, Aug. 2017.
[21] J. Bai, S. Lian, Z. Liu, K. Wang, and D. Liu, "Virtual-blind-road following-based wearable navigation device for blind people," IEEE Transactions on Consumer Electronics, vol. 64, no. 1, pp. 136-143, Feb. 2018.
[22] C.-W. Lee, P. Chondro, S.-J. Ruan, O. Christen, and E. Naroska, "Improving mobility for the visually impaired: a wearable indoor positioning system based on visual marker," IEEE Consumer Electronics Magazine, vol. 7, no. 3, pp. 12-20, 2018.
[23] A. Aladrén, G. López-Nicolás, L. Puig, and J. J. Guerrero, "Navigation assistance for the visually impaired using RGB-D sensor with range expansion," IEEE Systems Journal, vol. 10, no. 3, pp. 922-932, Sep. 2016.
[24] A. J. Ramadhan, "Wearable smart system for visually impaired people," Sensors, vol. 18, no. 3, article 843, Mar. 2018.
[25] W. M. Elmannai and K. M. Elleithy, "A highly accurate and reliable data fusion framework for guiding the visually impaired," IEEE Access, vol. 6, no. 1, pp. 33029-33054, 2018.
[26] D. Croce, L. Giarré, F. Pascucci, I. Tinnirello, G. E. Galioto, D. Garlisi, and A. L. Valvo, "An indoor and outdoor navigation system for visually impaired people," IEEE Access, vol. 7, no. 1, pp. 170406-170418, 2019.
[27] R. Tapu, B. Mocanu, and T. Zaharia, "DEEP-SEE: joint object detection, tracking and recognition with application to visually impaired navigational assistance," Sensors, vol. 17, no. 11, article 2473, Oct. 2017.
[28] S. Lin, R. Cheng, K. Wang, and K. Yang, "Visual localizer: outdoor localization based on ConvNet descriptor and global optimization for visually impaired pedestrians," Sensors, vol. 18, no. 8, article 2476, Jul. 2018.
[29] K. Yang, K. Wang, L. M. Bergasa, E. Romera, W. Hu, D. Sun, J. Sun, R. Cheng, T. Chen, and E. López, "Unifying terrain awareness for the visually impaired through real-time semantic segmentation," Sensors, vol. 18, no. 5, article 1506, May 2018.
[30] L.-B. Chen, J.-P. Su, M.-C. Chen, W.-J. Chang, C.-H. Yang, and C.-Y. Sie, "An implementation of an intelligent assistance system for visually impaired/blind people," in Proceedings of the 2019 IEEE International Conference on Consumer Electronics (ICCE), 2019, pp. 1-2.
[31] SHARP Electronic Components, "GP2Y0A710K0F: distance measuring sensor unit, measuring distance: 100 to 550 cm, analog output type," SHARP GP2Y0A710K0F Datasheet. [Online]. Available: [Link]
[32] Wikipedia, "Triangulation." [Online]. Available: [Link]
[33] InvenSense, "MPU-6050 six-axis (Gyro + Accelerometer) MEMS MotionTracking™ devices." [Online]. Available: [Link]
[34] H.-T. Lin, T.-J. Hsieh, M.-C. Chen, and W.-R. Chang, "ActionView: a movement-analysis ambulatory monitor in elderly homecare systems," in Proceedings of the 2009 IEEE International Symposium on Circuits and Systems (ISCAS), 2009, pp. 3098-3101.
[35] S.-L. Sun and Z.-L. Deng, "Multi-sensor optimal information fusion Kalman filter," Automatica, vol. 40, no. 6, pp. 1017-1023, Jun. 2004.
[36] Wikipedia, "Runge–Kutta methods." [Online]. Available: [Link]
[37] B. Buurman, J. Kamruzzaman, G. Karmakar, and S. Islam, "Low-power wide-area networks: design goals, architecture, suitability to use cases and research challenges," IEEE Access, vol. 8, no. 1, pp. 17179-17220, 2020.
[38] Q. M. Qadir, T. A. Rashid, N. K. Al-Salihi, B. Ismael, A. A. Kist, and Z. Zhang, "Low power wide area networks: a survey of enabling technologies, applications and interoperability needs," IEEE Access, vol. 6, no. 1, pp. 77454-77473, 2018.
[39] Y.-S. Chou, Y.-C. Mo, J.-P. Su, W.-J. Chang, L.-B. Chen, J.-J. Tang, and C.-T. Yu, "i-Car system: a LoRa-based low power wide area networks vehicle diagnostic system for driving safety," in Proceedings of the 2017 IEEE International Conference on Applied System Innovation (ICASI), 2017, pp. 789-791.
[40] Megawin Technology Co., Ltd., "MPC82G516AD Datasheet." [Online]. Available: [Link]
[41] Raspberry Pi. [Online]. Available: [Link]
[42] MediaTek Labs, "LinkIt ONE development kit." [Online]. Available: [Link]

Wan-Jung Chang (M'16) obtained a B.S. degree in electronic engineering from Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C., in 2000, an M.S. degree in computer science and information engineering from National Taipei University of Technology, Taipei, Taiwan, R.O.C., in 2003, and a Ph.D. in electrical engineering from National Cheng Kung University, Tainan, Taiwan, R.O.C., in 2008. He is currently an Associate Professor in the Department of Electronic Engineering at Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C. He is also the director of the Artificial Intelligence in the Internet of Things Applied Research Center (AIoT Center) and the Internet of Things Laboratory (IoT Lab) at Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C. He received the Best Paper Award at IEEE ChinaCom 2009, the Best Paper Award at ICCPE 2016, the 1st Prize for the Excellent Demo! Award at IEEE GCCE 2016–2018, and the 1st Prize for the Best Paper Award at IEEE ICASI 2017. His research interests include cloud/IoT/AIoT systems and applications, protocols for heterogeneous networks, and WSN/high-speed network design and analysis. He is a member of the IEEE.

Liang-Bi Chen (S'04–M'10–SM'16) received B.S. and M.S. degrees in electronic engineering from the National Kaohsiung University of Applied Sciences, Kaohsiung, Taiwan, in 2001 and 2003, respectively, and a Ph.D. in electronic engineering from the Southern Taiwan University of Science and Technology, Tainan, Taiwan, in 2019. From 2004 to 2010, he was enrolled in the Computer Science and Engineering Ph.D. Program with National Sun Yat-Sen University, Kaohsiung. In 2008, he had an internship with the Department of Computer Science, National University of Singapore, Singapore. He was also a Visiting Researcher with the Department of Computer Science, University of California at Irvine, Irvine, CA, USA, from 2008 to 2009, and with the Department of Computer Science and Engineering, Waseda University, Tokyo, Japan, in 2010. In 2012, he joined BXB Electronics Company Ltd., Kaohsiung, as a Research and Development Engineer, where he was an Executive Assistant to the Vice President from 2013 to 2016. In 2016, he joined the Southern Taiwan University of Science and Technology as a Research Manager and an Adjunct Assistant Professor. He is a member of the IEICE and PMI. He received the 1st Prize for the Best Paper Award from IEEE ICASI 2017, an Honorable Mention at IEEE ICAAI 2016, the 1st Prize for the Excellent Demo! Award from IEEE GCCE 2016–2018, and the 2nd Prize for the Excellent Poster Award from IEEE GCCE 2016.


He was a recipient of the 2014 IEEE Education Society Student Leadership Award, the 2019 Young Scholar Potential Elite Award of the National Central Library, Taiwan, and the 2018–2019 Publons Peer Review Award. He has also served as a TPC member, an IPC member, and a reviewer for many IEEE/ACM international conferences and journals. Since 2019, he has served as an Associate Editor for IEEE Access and as a Guest Editor for the Energies and Applied Sciences journals.

Ming-Che Chen (M'19) received B.S. and M.S. degrees in computer science from Tunghai University, Taichung, Taiwan, R.O.C., in 2003 and 2006, respectively, and a Ph.D. from the Institute of Computer and Communication Engineering of National Cheng Kung University (NCKU), Tainan, Taiwan, R.O.C., in 2014. He has worked as an engineer in the research and development of the industrial Internet of Things (IoT) in the Information and Communications Research Laboratories (ICRL) of the Industrial Technology Research Institute (ITRI) in the Southern Taiwan Innovation & Research Park, Ministry of Economic Affairs (MOEA), Taiwan, R.O.C., since 2014. In May 2018, he joined the Department of Electronic Engineering at Southern Taiwan University of Science and Technology (STUST), Tainan, Taiwan, R.O.C., as a Postdoctoral Fellow and an Adjunct Assistant Professor. His present research interests are in artificial intelligence applications to the Internet of Things (AIoT), the industrial Internet of Things (IIoT), cloud-based application system design, wireless sensor networks, and wireless networks. He is a member of the IEEE.

Jian-Ping Su received B.S. and M.S. degrees in electronic engineering from the Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C., in 2016 and 2018, respectively. He is currently pursuing a Ph.D. in electronic engineering at Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C. His research interests include AIoT system design and applications, wearable device design, and IoT platform development.

Cheng-You Sie received a B.S. degree in electronic engineering from the Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C., in 2018. He is currently pursuing an M.S. degree in electronic engineering at Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C. His research interests include AIoT system design and applications, wearable device design, and intelligent assistive device development.

Ching-Hsiang Yang received a B.S. degree in electronic engineering from the Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C., in 2018. He is currently pursuing an M.S. degree in electronic engineering at Southern Taiwan University of Science and Technology, Tainan, Taiwan, R.O.C. His research interests include AIoT system design and applications, IoT-based LPWAN wireless network development, machine learning, and green energy system development.
