
JURNAL INFOTEL

Informatics - Telecommunication - Electronics


Website: http://ejournal.st3telkom.ac.id/index.php/infotel
ISSN: 2085-3688; e-ISSN: 2460-0997

Implementation of line detection self-driving car using HSV method based on raspberry pi 4

Florentinus Budi Setiawan1,*, Eric Pratama Putra1, Leonardus Heru Pratomo1, Slamet Riyadi1
1 Department of Electrical Engineering, Soegijapranata Catholic University
1 Pawiyatan Luhur Street, No. 1, Semarang 50234, Indonesia
* Corresponding email: [email protected]

Received 7 July 2022, Revised 21 August 2022, Accepted 1 September 2022

Abstract — With the development of technology, especially in industry, many daily human activities can be carried out by artificial intelligence robots. Self-driving cars can be realized with several methods: GPS systems, radar, lidar, or cameras. In this study, a self-driving car system was designed around a navigation path model using a street mark detector with a camera as the vision sensor. The self-driving car moves along the path defined by the detected line. This research uses the HSV method for line detection, with OpenCV for image processing using the dilate, morphologyEx, and Gaussian blur operations. The resulting path reading is considerably more accurate than that of robots using the line follower method. After the processing stages have been completed and the path has been read, the robot drives its motors to follow the path detected by the camera. Based on hardware testing in the Autonomous Car laboratory with computer vision, the system works according to the algorithm. Using image processing with the HSV method, the resulting precision reaches the average value according to the direction of the self-driving car.
Keywords – HSV, image processing, raspberry pi 4, self-driving car

Copyright ©2022 JURNAL INFOTEL


All rights reserved.

I. INTRODUCTION

The advantages of technology in the modern era make every developed country proud, especially in terms of scientific progress in transportation [1]. Transportation itself is an important part of human life, and one of the most widely used forms of private transportation is the car. Therefore, a technology was developed to drive a car automatically: the self-driving car [2]. A self-driving car is a vehicle that can move and recognize objects around it without human control. This vehicle uses various sensors such as radar, Light Detection and Ranging (LIDAR), a Global Positioning System (GPS), odometry, and cameras to detect objects in the vicinity [3]. Sensors generate information that the control system processes to identify navigation paths and obstacles. Building on a self-driving car system design that combines multiple sensors [4], this research develops a self-driving car system on an autonomous car prototype that can move along a track automatically using a camera as a vision sensor to detect road signs, so the car continues to run following the detected pattern [5].

The main goal of self-driving cars is to solve their functionality problem, namely the navigation path design model. Planning the navigation path model is the main requirement so that the self-driving car can run in the right direction [6]. The application of self-driving cars is vision processing, and the lane model can be determined based on the road markings that the car will traverse. Therefore, street mark detection is the most straightforward solution to the lane model planning problem. All information is captured in real time using camera sensors to detect road objects. Based on previous research on self-driving cars that provided a model design combining several sensors [7], and research on self-driving cars using the kinematic model [8], this study designed a self-driving car: an HSV algorithm system with a camera as a vision sensor to detect the street mark, and a colour quantization feature method using thresholding clustering to overcome the shortcomings of this autonomous car prototype [9].

Jurnal Infotel, Vol. 14, No. 4, November 2022
https://doi.org/10.20895/infotel.v14i4.801
This study designed a self-driving car system using a prototype autonomous car that can move automatically by following road markings detected with a camera as the vision sensor, using the Hough transform method to adjust the colour of the object so that the image taken of the object in real time can be processed on a Raspberry Pi 4 with the HSV filtering method.

II. RESEARCH METHOD

This section discusses the self-driving car systems and components, the self-driving car hardware design, the HSV method on the self-driving car, real-time object detection, and the Ackerman model of the self-driving car.

A. Self-Driving Car Systems and Components

Fig. 1 illustrates the controller hardware scheme of the self-driving car. In this design, the hardware components used are a Raspberry Pi 4 Model B as the microcontroller, a Raspberry camera as the vision sensor, an L298N motor driver as the front-wheel drive, a BTS7960 motor driver as the rear-wheel drive, an Arduino Nano as the driver liaison, and a 12 V, 12 Ah battery. The device functions are as follows:

a) Raspberry Pi 4 Model B
The Raspberry Pi is a minicomputer developed in the UK by the Raspberry Pi Foundation. The Raspberry Pi 3 Model B was released in February 2016; its Broadcom BCM2837 SoC has a 64-bit quad-core ARM Cortex-A53 processor running at speeds up to 1.2 GHz, 1 GB of RAM, onboard Wi-Fi and Bluetooth, and USB boot capability [10]. When using the Raspberry Pi 4, one requirement is real-time programming performance, to achieve stable and responsive results. Several companion components are needed, namely a microSD card and a cooling fan (heat sink), so that the scope of objects to be detected can be covered effectively and without delay.

b) Pi Camera
The Pi camera is a module that sends data to the Raspberry Pi 4 using an elastic (ribbon) cable to support sending programs. In addition, the camera module has a high-speed connection capable of sending images and recording videos at high resolution [11].

c) Arduino Nano
The Arduino Nano [12] is a small, easy-to-use board based on the ATmega328P architecture, with an ATmega328 microcontroller operating at 5 V. It has 32 KB of flash memory, of which the bootloader uses 2 KB. The Arduino Nano processor clock speed is 16 MHz, and it serves to send the program to the BTS7960 driver.

d) Driver Motor L298N
The L298N motor driver is a driver module for DC motors. The L298N driver is a speed regulator and can control the direction of rotation of a DC motor [13]. Therefore, the L298N motor driver is well suited to robot design applications and is ideal for connecting to a microcontroller or Raspberry Pi 4 Model B.

e) Driver Motor BTS7960
This DC motor driver can produce up to 43 A of current and functions to drive the rear-wheel DC motor. The DC source voltage that can be supplied is between 5.5 V and 27 V DC, while the input voltage level is between 3.3 V and 5 V DC [14]. This motor driver uses a complete H-bridge circuit with BTS7960 IC overheating and overcurrent protection.

Fig. 1. Hardware schematic.

B. Self-Driving Car Hardware Design

The hardware used in this self-driving car is a prototype autonomous car with a drive system, namely a DC motor for speed control. The sensor used in this self-driving car is the Raspberry Pi camera, while the controller is a Raspberry Pi 4 Model B together with an Arduino Nano, which runs the program. As motor drivers, L298N and BTS7960 drivers are used, which function as DC motor regulators. Fig. 2 shows the prototype of the autonomous car.

The self-driving car is designed to detect the location of road marking objects through the results of the image processing taken by the Raspberry camera. The camera is operated as an image receiver by the Raspberry Pi 4 Model B minicomputer.

The coordinates resulting from detecting road markings in the image are sent to the Raspberry Pi 4 through communication between the camera and the Raspberry Pi 4. In addition, the detected street mark objects are forwarded to run the DC motor as the steering wheel and the movement of the self-driving car [15].

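The direction and speed behaviour described for the motor drivers above can be sketched as simple logic. This is an illustrative model, not the authors' code: the truth table reflects how one L298N H-bridge channel responds to its two input pins, and the helper names are assumptions for illustration (on the real prototype these levels would be driven through GPIO/PWM outputs).

```python
# Sketch (not the authors' code): L298N H-bridge channel logic and a
# PWM duty-cycle clamp. Function names are illustrative assumptions.

def l298n_inputs(direction: str):
    """Return the (IN1, IN2) logic levels for one L298N channel."""
    table = {
        "forward": (1, 0),  # IN1 high, IN2 low  -> motor spins forward
        "reverse": (0, 1),  # IN1 low,  IN2 high -> motor spins in reverse
        "stop":    (0, 0),  # both low -> motor coasts/stops
    }
    return table[direction]

def duty_for_speed(speed_fraction: float) -> float:
    """Clamp a 0..1 speed request and express it as a PWM duty cycle (%)."""
    return max(0.0, min(1.0, speed_fraction)) * 100.0
```

With a hardware PWM library on the Raspberry Pi, the duty cycle returned here would set the motor speed while the (IN1, IN2) pair sets the rotation direction.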

Fig. 2. Prototype autonomous car.

The image processing for detecting road signs, and the working relationship between the hardware components, can be seen in Fig. 3.

Fig. 3. Flowchart system of self-driving car.

The prototype autonomous car in this design uses a 12 V, 12 Ah battery as the voltage source supplying power to the Raspberry Pi 4 and the DC motors.

C. HSV Method on Self-Driving Car

HSV (Hue, Saturation, Value) is one of the methods used in this self-driving car. HSV functions in colour selection because the HSV method approximates the way the human eye distinguishes colours. HSV is the result of a transformation of the RGB (Red, Green, Blue) colour form, and the light intensity or colour brightness value varies from 0 to 100 [16].

The HSV colour method is beneficial in detecting street marks by segmenting from the lowest to the highest colour range on street mark objects. HSV has three main characteristics, namely Hue, Saturation, and Value, which take different values. The Hue value ranges from 0 to 255, with the value 0 being red. Saturation describes the intensity of the colour and covers values 0 to 255; the lower the saturation value, the greyer the colour appears. The Value component also ranges from 0 to 255, with 0 being the darkest and 255 the brightest colour. The range of the HSV colour space used in this study is shown in Fig. 4.

Fig. 4. The range of the HSV colour.

The HSV space method has the following formulas, with the notation shown in Table 1.

Table 1. Notations of the HSV Space Method
Notation  Definition
R'        Red value
G'        Green value
B'        Blue value
Cmax      Maximum contrast value
Cmin      Minimum contrast value
H         Hue
S         Saturation
V         Value
Δ         Difference between Cmax and Cmin

R' = R/255,  G' = G/255,  B' = B/255    (1)

Eq. (1) shows the RGB colour composition, i.e., the initial input of the colour combination to be used.

Cmax = max(R', G', B')    (2)
Cmin = min(R', G', B')    (3)

Eqs. (2) and (3) determine the maximum contrast and the minimum contrast in the RGB colour composition used.

Δ = Cmax − Cmin    (4)

Eq. (4) is the difference between the maximum contrast and the minimum contrast. Meanwhile, (5) is the equation that produces the hue value using the R, G, and B colour values, the minimum contrast, and the maximum contrast. Thus, the Hue calculation becomes:

H = 0°,                            if Δ = 0
H = 60° × (((G' − B') / Δ) mod 6), if Cmax = R'
H = 60° × ((B' − R') / Δ + 2),     if Cmax = G'
H = 60° × ((R' − G') / Δ + 4),     if Cmax = B'    (5)

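Eqs. (1)–(5) can be transcribed directly into a short function for checking the hue formula. This is a sketch of the published equations, not the authors' implementation (the paper relies on OpenCV's built-in conversion):

```python
# Direct transcription of eqs (1)-(5): hue in degrees from 8-bit RGB.
# A sketch for checking the formulas, not the authors' code.

def hue_degrees(r: int, g: int, b: int) -> float:
    rp, gp, bp = r / 255, g / 255, b / 255   # eq (1): normalize to 0..1
    cmax = max(rp, gp, bp)                   # eq (2)
    cmin = min(rp, gp, bp)                   # eq (3)
    delta = cmax - cmin                      # eq (4)
    if delta == 0:                           # eq (5), case delta = 0 (grey)
        return 0.0
    if cmax == rp:
        return 60 * (((gp - bp) / delta) % 6)
    if cmax == gp:
        return 60 * ((bp - rp) / delta + 2)
    return 60 * ((rp - gp) / delta + 4)      # cmax == B'
```

Note that OpenCV stores hue as H/2, i.e., in the range 0–179, which is consistent with the hue thresholds reported later in this paper topping out at 179.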

Eq. (5) produces the hue value using the R, G, and B colour values, the minimum contrast, and the maximum contrast.

The Saturation calculation is shown in (6):

S = 0,          if Cmax = 0
S = Δ / Cmax,   if Cmax ≠ 0    (6)

Eq. (6) produces the saturation value by using the difference and division of the minimum and maximum contrast values.

The Value calculation is:

V = Cmax    (7)

Eq. (7) shows that the Value is determined by the maximum contrast value.

If the object's colour composition is within the specified HSV value range, the object becomes white, as it is assumed to match the range value. Meanwhile, objects with a colour composition outside the range value change colour to black. Fig. 5 shows the threshold process for changing the HSV colour range.

Fig. 5. Thresholding process based on HSV color range.

The image above shows the thresholding process, which preserves the original object image alongside the image modified using the HSV colour range method.

D. Real-Time Object Detection

Video-based camera sensors are used in real time to detect road signs. Real-time detection is processed if the frame meets the criteria for the appropriate range value at calibration time. For example, in self-driving car research, calibration is needed to determine the HSV colour values used as input in measuring road marking objects in real-time video testing [17]. Fig. 6 is a flowchart of object detection in real time.

Fig. 6. Real-time detection flowchart.

The flowchart above shows how the computer vision works. First, the program runs; then the camera for detecting road markings becomes active and displays a video stream for detecting the roads that the autonomous car will traverse.

E. Ackerman Model on a Self-Driving Car

The autonomous car prototype assumes a kinematic model in which the slip angles of the front and rear wheels are the same; therefore, the kinematic model applies to low-speed applications [18]. The front and rear wheels are connected by a metal frame that allows the autonomous car to move forward and backwards simultaneously [19], [20]. Therefore, in the turning motion of this autonomous car, an angle or coordinate point can be formed. The following are the equations of the kinematic model of the autonomous car, with the notation shown in Table 2.

Table 2. Notations of the Kinematic Model
Notation  Definition
Ẋ         X coordinate
Ẏ         Y coordinate
vb        Rear wheel speed
vf        Front wheel speed
la, lb    Length from the center of mass of the AGV to axis P
lf        Length from the AGV axis to point Q
β         Slip angle of the AGV
dt        Change over time
ψ̇         Change in angle at the center of the axis
γ         Gamma angle

Ẋ = V cos(ψ + β)    (8)
Ẏ = V sin(ψ + β)    (9)
ψ̇ = V cos(β) tan(δf) / L    (10)

Here, β (rad) represents the slip angle of the AGV, and the direction of the AGV is determined by the gamma angle γ.

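The thresholding step described above (white inside the calibrated HSV range, black outside) can be sketched with NumPy. This mirrors the behaviour of OpenCV's cv2.inRange(); the default range values below are the calibration results reported in the test section of this paper, and the function name is an illustrative assumption:

```python
import numpy as np

# Sketch of the HSV thresholding step: pixels whose (H, S, V) components
# all fall inside the calibrated range become white (255), everything
# else black (0). Equivalent in behaviour to cv2.inRange().

def hsv_threshold(hsv_img: np.ndarray,
                  lo=(155, 54, 47), hi=(179, 255, 95)) -> np.ndarray:
    lo = np.array(lo, dtype=hsv_img.dtype)
    hi = np.array(hi, dtype=hsv_img.dtype)
    # inside-range test per pixel, across the last (channel) axis
    inside = np.all((hsv_img >= lo) & (hsv_img <= hi), axis=-1)
    return np.where(inside, 255, 0).astype(np.uint8)
```

In the full pipeline named in the abstract, this mask would be preceded by Gaussian blur and followed by dilate/morphologyEx operations to clean up noise in the detected line.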

The initial rear wheel speed determines the initial speed of the autonomous car. The slip angle β and the linear speed of the autonomous car are written as the following equations:

β = tan⁻¹( lf tan(γ) / (la + lb + lf) )    (11)

V = (vf cos(γ) + vb) / (2 cos(β))    (12)

The AGV position is then determined by applying the integration of (8) – (10) over time t (s):

X = ∫₀ᵗ V cos(ψ + β) dt    (13)
Y = ∫₀ᵗ V sin(ψ + β) dt    (14)
ψ = ∫₀ᵗ V cos(β) tan(δf) / L dt    (15)

Fig. 7 shows the Ackerman frame model of the self-driving car.

Fig. 7. Frame Ackerman autonomous car.

III. RESULT AND DISCUSSION

This test is carried out to determine the functioning of the prototype autonomous car, covering the stability and the ability of the self-driving car system to detect street marks. In this way, we can see how fast the real-time system detects street mark objects. With a real-time system, the self-driving car can calculate the detection distance based on the detection area.

A. Self-Driving Car Speed Test Based on Turning Angle

To determine the ability of the prototype autonomous car to move according to a predetermined path, it is necessary to test the speed and turning angle. This test aims to determine the linear speed of the autonomous car. The linear speed itself is influenced by the rotation of the DC motor used in the prototype autonomous car. The test is carried out with the robot running on a predetermined path, and the rotational speed of the wheels is measured with a tachometer. The measurement results give the average speed value in units of RPM (revolutions per minute). The results of the rotation speed measurement can be seen in Table 3.

Table 3. Rotation Speed
No.  PWM duty cycle (%)  Rotation Speed (RPM)  Autonomous Car Speed Condition
1    30                  125.5                 Slow
2    40                  210.8                 Slow
3    50                  274.8                 Moderate
4    60                  293.5                 Moderate
5    70                  418.9                 Fast

It can be seen in Table 3 that the speed test is measured at duty cycles of 30% to 70%. At a duty cycle of 30% to 40%, the motor in the autonomous car runs slowly, in the range of 125 rpm to 210 rpm. At a 50% to 60% duty cycle, the autonomous car motor rotates at 274 rpm to 293 rpm, while at a 70% duty cycle the motor speed rises to 418 rpm. As the duty cycle increases, the motor rotates correspondingly faster. This shows the rotational speed of the wheels when the autonomous car is running on the track, and the result informs how the PWM duty cycle is set.

For testing, the autonomous car results are reinforced by the data from the output signal, obtained through oscilloscope measurements. The oscilloscope images show the results of the PWM signal measurement corresponding to Table 3.

Based on the test results, it can be analyzed that the autonomous car can turn at a maximum angle. The PWM value determines whether the autonomous car runs slowly or fast. The first measurement, with a PWM duty cycle of 30%, is shown in Fig. 8 (a) as the signal on the oscilloscope; Fig. 8 (b) shows the resulting speed of 125.5 RPM. The second measurement, with a PWM duty cycle of 70%, is shown in Fig. 9 (a); Fig. 9 (b) shows the resulting speed of 418.5 RPM.

B. HSV Color Test on Self-Driving Car

HSV testing is carried out to check the processing of street mark objects on a predetermined path. In this test, the thresholding method is used, namely measuring the maximum and minimum limits. In addition, this test is carried out to distinguish the white and black colours found in the HSV method, which have a maximum or minimum value.

To find the maximum and minimum limits of the HSV method, we run a test distinguishing the filter response to black objects and white objects. This test is done by activating the program contained in the Raspberry Pi 4 Model B. Fig. 10, Fig. 11, and Fig. 12 and Table 4, Table 5, and Table 6 represent the testing of the HSV method.

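The kinematic model in (8)–(15) above can be integrated numerically to trace the AGV position, for example with a simple Euler scheme. The wheel-speed, length, and time-step values below are illustrative assumptions, not measurements from the prototype; a single steering angle stands in for both δf in (10) and γ in (11):

```python
import math

# Euler integration of the kinematic model, eqs (8)-(15).
# All numeric parameters are illustrative, not from the prototype.

def simulate(v, gamma, la, lb, lf, L, steps=100, dt=0.01):
    """Integrate X, Y, psi over `steps` time steps of length `dt`."""
    x = y = psi = 0.0
    # eq (11): slip angle from the steering angle and frame lengths
    beta = math.atan(lf * math.tan(gamma) / (la + lb + lf))
    for _ in range(steps):
        x += v * math.cos(psi + beta) * dt                    # eqs (8)/(13)
        y += v * math.sin(psi + beta) * dt                    # eqs (9)/(14)
        psi += v * math.cos(beta) * math.tan(gamma) / L * dt  # eqs (10)/(15)
    return x, y, psi
```

With zero steering angle the slip angle vanishes and the model reduces to straight-line motion, which is a quick sanity check on the equations.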

Fig. 8. (a) Signal duty cycle PWM 30%, (b) Acceleration output 125.5 RPM.

Fig. 9. (a) Signal duty cycle PWM 70%, (b) Acceleration output 418.5 RPM.

Fig. 10. Hue threshold test.

Table 4. Hue Threshold Test
Hue Min  Hue Max  White Object  Black Object
155      179      Detected      Not Detected

Fig. 11. Saturation threshold test.

Table 5. Saturation Threshold Test
Saturation Min  Saturation Max  White Object  Black Object
54              255             Detected      Not Detected

Fig. 12. Value threshold test.

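The duty-cycle/speed pairs measured in Table 3 can be interpolated to estimate wheel speed at untested duty cycles. The piecewise-linear assumption is ours, made only for illustration; the five measured points are the ones reported in the paper:

```python
# Piecewise-linear interpolation over the Table 3 measurements
# (duty cycle %, wheel speed RPM). The linearity between points is
# an assumption for illustration, not a claim from the paper.

MEASURED = [(30, 125.5), (40, 210.8), (50, 274.8), (60, 293.5), (70, 418.9)]

def rpm_at(duty: float) -> float:
    """Estimate wheel RPM at a given PWM duty cycle, clamped to the data."""
    if duty <= MEASURED[0][0]:
        return MEASURED[0][1]
    for (d0, r0), (d1, r1) in zip(MEASURED, MEASURED[1:]):
        if duty <= d1:
            return r0 + (r1 - r0) * (duty - d0) / (d1 - d0)
    return MEASURED[-1][1]
```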

Table 6. Value Threshold Test
Value Min  Value Max  White Object  Black Object
47         95         Detected      Not Detected

Based on Fig. 10, Fig. 11, and Fig. 12, the thresholding calibration value is determined by finding the minimum and maximum values of the HSV (Hue, Saturation, Value) method, which aims to detect white and black objects in street mark object detection.

C. Final Result of Street Mark Detection with HSV Method

In this last test, we can see that the autonomous car moves following the street mark object that has been processed based on the HSV method in real time, so that tracking appears on every path the camera has detected. The emergence of this tracking path makes the autonomous car run stably, following the path that has been seen. In addition, the tracking line serves to determine the turning angle of the autonomous car, so that the self-driving car system functions properly according to the algorithm that has been created.

Fig. 13. Self-driving car test.

Based on Fig. 13, the test has been carried out indoors and outdoors. It is better to test this self-driving car indoors to get maximum results: in the tests carried out indoors, there is no interference from the intensity of sunlight, whereas in outdoor testing the sunlight is very strong, so the performance of the HSV method is not optimal.

IV. CONCLUSION

Based on the testing and analysis that has been carried out, the autonomous car prototype has been successfully designed according to the specifications of the self-driving car system, with vision image processing performed using the HSV method as a street mark object detector, thus making the autonomous car run following street mark objects. The threshold used from this calibration has a minimum and maximum value range that enables the HSV method to process a road mark object. From this test, the resulting value range is hue min = 155, hue max = 179, saturation min = 54, saturation max = 255, and value min = 47, value max = 95, with the result that HSV detects street marks well. In this process, the Raspberry Pi processes the image data received from the Raspberry camera, so that the real-time function on the visual image is active, which makes the BTS7960 driver and the L298N driver run correctly, using a DC motor whose PWM duty cycle is adjusted to the turning angle on the self-driving car track to determine the speed and torque of the DC motor when the autonomous car is running. The maximum value of the turning angle is proportional to the speed of the autonomous car: the faster the autonomous car runs, the smaller the turning angle, and vice versa. The program that has been made and researched is accurate and has almost no street mark error. In the future, this research will be directed at industrial needs, to be applied to autonomous cars in industry.

ACKNOWLEDGMENT

The author would like to thank Soegijapranata Catholic University for supporting facilities and places to conduct research in the laboratory. Hopefully, this tool can be helpful for technological progress in the world. Also, the author would like to thank the University President for his support.

REFERENCES

[1] M. V. G. Aziz, A. S. Prihatmanto, and H. Hindersah, "Implementation of lane detection algorithm for self-driving car on toll road cipularang using Python language," in 2017 4th International Conference on Electric Vehicular Technology (ICEVT), 2017, pp. 144–148. doi: 10.1109/ICEVT.2017.8323550.

[2] Z. M. U. Din, W. Razzaq, U. Arif, W. Ahmad, and W. Muhammad, "Real time Ackerman steering angle control for self-driving car autonomous navigation," in 2019 4th International Conference on Emerging Trends in Engineering, Sciences and Technology (ICEEST), 2019, pp. 1–4. doi: 10.1109/ICEEST48626.2019.8981710.

[3] R. Run and G. Sun, "The cost-effective GPS guided autonomous vehicle - a feasibility study based on self-balancing scooter," in 2017 International Conference on Applied System Innovation (ICASI), 2017, pp. 778–780. doi: 10.1109/ICASI.2017.7988546.

[4] H. Cho, Y.-W. Seo, B. V. K. V. Kumar, and R. R. Rajkumar, "A multi-sensor fusion system for moving object detection and tracking in urban driving environments," in 2014 IEEE International Conference on Robotics and Automation (ICRA), 2014, pp. 1836–1843. doi: 10.1109/ICRA.2014.6907100.

[5] S. N. K. Rangan, V. G. Yalla, D. Bacchet, and I. Domi, "Improved localization using visual features and maps for autonomous cars," in 2018 IEEE Intelligent Vehicles Symposium (IV), 2018, pp. 623–629. doi: 10.1109/IVS.2018.8500540.

[6] J. Newman, Z. Sun, and D.-J. Lee, "Self-driving cars: a platform for learning and research," in 2020 Intermountain Engineering, Technology and Computing (IETC), 2020, pp. 1–5. doi: 10.1109/IETC47856.2020.9249142.


[7] A. A. Mahersatillah, Z. Zainuddin, and Y. Yusran, "Unstructured road detection and steering assist based on HSV color space segmentation for autonomous car," in 2020 3rd International Seminar on Research of Information Technology and Intelligent Systems (ISRITI), 2020, pp. 688–693. doi: 10.1109/ISRITI51436.2020.9315452.
[8] Y. Chen, S. Chen, T. Zhang, S. Zhang, and N. Zheng, 10.17529/jre.v18i2.25863.
”Autonomous vehicle testing and validation platform:
integrated simulation system with hardware in the loop,” in
2018 IEEE Intelligent Vehicles Symposium (IV), 2018, pp.
949–956. doi: 10.1109/IVS.2018.8500461.

[9] E. Oruklu, D. Pesty, J. Neveux, and J. Guebey, ”Real-time


traffic sign detection and recognition for in-car driver assistance
systems,” in 2012 IEEE 55th International Midwest Symposium
on Circuits and Systems (MWSCAS), 2012, pp. 976–979. doi:
10.1109/MWSCAS.2012.6292185.

[10] K. B. Swain, S. Dash, and S. S. Gouda, ”Raspberry pi


based integrated autonomous vehicle using LabVIEW,” in
2017 Third International Conference on Sensing, Signal
Processing and Security (ICSSS), 2017, pp. 69–73. doi:
10.1109/SSPS.2017.8071567.

[11] T. Sorwar, S. B. Azad, S. R. Hussain, and A. I. Mahmood,


”Real-time Vehicle monitoring for traffic surveillance and
adaptive change detection using raspberry pi camera
module,” in 2017 IEEE Region 10 Humanitarian Technology
Conference (R10-HTC), 2017, pp. 481–484. doi: 10.1109/R10-
HTC.2017.8289003.

[12] A. R. Hutauruk, J. Pardede, P. Aritonang, R. F. Saragih,


and A. Sagala, ”Implementation of wireless sensor
network as fire detector using arduino nano,” in 2019
International Conference of Computer Science and
Information Technology (ICoSNIKOM), 2019, pp. 1–4.
doi: 10.1109/ICoSNIKOM48755.2019.9111537.

[13] L. Chen, J. Zhang, and Y. Wang, ”Wireless car control


system based on arduino uno R3,” in 2018 2nd IEEE
Advanced Information Management,Communicates,Electronic
and Automation Control Conference (IMCEC), 2018, pp.
1783–1787. doi: 10.1109/IMCEC.2018.8469286.

[14] P. Bhowmik, P. K. Rout, J. M. Guerrero, and A. Abusorrah,


”Vector measurement-based virtual inertia emulation technique
for real-time transient frequency regulation in microgrids,”
IEEE Transactions on Power Electronics, vol. 36, no. 6, pp.
6685–6698, 2021. doi: 10.1109/TPEL.2020.3034699.

[15] S. Chaudhari, P. Shendge, and S. Phadke, ”Disturbance


observer based model following controller for
four wheel steering vehicles,” in 2019 Second
International Conference on Advanced Computational
and Communication Paradigms (ICACCP), 2019, pp. 1–6.
doi: 10.1109/ICACCP.2019.8883014.

[16] A. Aqthobilrobbany, A. N. Handayani, D. Lestari,


Muladi, R. A. Asmara and O. Fukuda, ”HSV Based
Robot Boat Navigation System,” 2020 International
Conference on Computer Engineering, Network, and
Intelligent Multimedia (CENIM), 2020, pp. 269–273. doi:
10.1109/CENIM51130.2020.9297915.

[17] S. Kolkur, D. Kalbande, P. Shimpi, C. Bapat, and J. Jatakia,


“Human skin detection using RGB, HSV, and YCbCr color
models,” in Advances in Intelligent Systems Research, vol. 137,
pp. 324–332, 2017. doi: 10.2991/iccasp-16.2017.51.

[18] Y. Lin, H. Zhao, C. Ye, and H. Ding, ”A computationally


efficient and robust kinematic calibration model for
industrial robots with kinematic parallelogram,” in
2017 IEEE International Conference on Robotics
and Biomimetics (ROBIO), 2017, pp. 1334–1339. doi:
10.1109/ROBIO.2017.8324602.

[19] F. B. Setiawan, O. J. Aldo Wijaya, L. H. Pratomo, and S. Riyadi, "Sistem navigasi automated guided vehicle berbasis computer vision dan implementasi pada raspberry pi," J. Rekayasa Elektr., vol. 17, no. 1, pp. 7–14, 2021. doi: 10.17529/jre.v17i1.18087.

[20] F. B. Setiawan, Y. Yovie, C. Wibowo, L. H. Pratomo, and S. Riyadi, "Perancangan automated guided vehicle menggunakan penggerak motor DC dan motor servo berbasis raspberry pi 4," J. Rekayasa Elektr., vol. 18, no. 2, pp. 94–101, 2022. doi: 10.17529/jre.v18i2.25863.