Drone Project Report

The project integrates a Raspberry Pi 5 with a Pixhawk 2.4.8 flight controller for real-time object detection using the YOLOv8 nano model and MAVLink protocol for drone control. Key components include hardware setup, software libraries, and a unified script (`run_drone.py`) for operation, with completed steps and error resolutions documented. Future tasks involve optimizing the model, enhancing MAVLink commands, and conducting structured testing for performance evaluation.


Drone Object Detection & MAVLink Integration Project Report


1. Project Overview
This project integrates a Raspberry Pi 5 with a Pixhawk 2.4.8 flight controller to enable
object detection and drone control using the YOLOv8 nano model. The Raspberry Pi handles
real-time computer vision while communicating with the Pixhawk over the MAVLink protocol
to guide drone movement based on detection results. Both object detection and MAVLink
communication are unified in `run_drone.py`.
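The detection-to-control flow described above can be sketched as follows. This is a minimal illustration, not the exact project code: the steering helper `steer_from_bbox` and the console output are assumptions, and the hardware-dependent part requires ultralytics, OpenCV, PyMAVLink, a camera, and a Pixhawk on `/dev/serial0`.

```python
# Minimal sketch of the detection-to-steering flow in run_drone.py.
# steer_from_bbox is a simplified illustration, not the project's exact logic.

def steer_from_bbox(box_center_x: float, frame_width: int,
                    dead_zone: float = 0.2) -> str:
    """Map a detection's horizontal position to a coarse steering hint.

    Returns "left", "right", or "center" depending on where the bounding
    box center falls relative to the middle of the frame.
    """
    offset = (box_center_x - frame_width / 2) / (frame_width / 2)
    if offset < -dead_zone:
        return "left"
    if offset > dead_zone:
        return "right"
    return "center"


def main() -> None:
    # Hardware-dependent: requires ultralytics, OpenCV, pymavlink,
    # a connected camera, and a Pixhawk on /dev/serial0.
    import cv2
    from ultralytics import YOLO
    from pymavlink import mavutil

    model = YOLO("yolov8n.pt")
    master = mavutil.mavlink_connection("/dev/serial0", baud=57600)
    master.wait_heartbeat()

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        results = model(frame)
        for box in results[0].boxes:
            x1, y1, x2, y2 = box.xyxy[0].tolist()
            hint = steer_from_bbox((x1 + x2) / 2, frame.shape[1])
            print(f"detected at {hint}")  # placeholder for MAVLink commands


if __name__ == "__main__":
    main()
```

Keeping the steering rule as a pure function makes it easy to tune and test without a camera or flight controller attached.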

2. Hardware and Software Setup

Hardware Components:

- Raspberry Pi 5
- Pixhawk 2.4.8 Flight Controller
- USB Camera
- UART Cable or USB for Pixhawk Communication
- Drone Frame with Motors and ESCs
- Power Supply (5V, 3A)

Software Components:

- Raspberry Pi OS (Bookworm 64-bit)
- Python 3.9+
- OpenCV and Picamera2 Libraries
- Ultralytics YOLOv8 (nano model)
- PyMAVLink (for Pixhawk communication)
- MAVProxy (for debugging and telemetry)
- QGroundControl (for Pixhawk configuration)

3. Completed Steps

✔ Raspberry Pi OS installed and updated.
✔ Camera interface enabled and tested.
✔ YOLOv8 environment setup with ultralytics and required libraries.
✔ Pixhawk successfully connected via USB to Raspberry Pi.
✔ MAVProxy installed and configured for telemetry.
✔ PyMAVLink installed and tested with heartbeat connection.
✔ Object detection code with YOLOv8 running successfully.
✔ Detected objects visualized with bounding boxes.
✔ Combined script `run_drone.py` created for unified operation.

4. Errors and Fixes

1. MAVProxy connection failed: Resolved by using a USB connection instead of UART.
2. Serial port access denied: Fixed by enabling hardware serial via `raspi-config` and
adjusting permissions.
3. Camera not detected: Resolved by confirming `picamera2` and OpenCV installation and
checking `/dev/video*` devices.
4. YOLOv8 import error: Solved by using a clean virtual environment and installing with
`pip install ultralytics`.
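The camera and serial-port fixes above suggest a small pre-flight check before launching `run_drone.py`. The script below is a hypothetical sketch (the function names and messages are not from the project); the device checks themselves use only standard `glob` and `os` calls.

```python
# Hypothetical pre-flight check inspired by the fixes above: verify that a
# camera device and the serial port are visible before starting run_drone.py.
import glob
import os


def preflight_report(video_devices, serial_ok):
    """Summarize device availability from pre-gathered inputs."""
    issues = []
    if not video_devices:
        issues.append("no /dev/video* device found (check camera cable/driver)")
    if not serial_ok:
        issues.append("serial port not accessible (check raspi-config and "
                      "dialout group membership)")
    return issues


if __name__ == "__main__":
    videos = glob.glob("/dev/video*")
    serial_ok = os.access("/dev/serial0", os.R_OK | os.W_OK)
    for issue in preflight_report(videos, serial_ok):
        print("WARNING:", issue)
```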

5. Project Folder Structure

Project structure under the Raspberry Pi home directory:

/home/pi/
└── project_drone/
    ├── mavenv/                    # MAVProxy environment
    ├── object_detection_drone/    # YOLOv8 & object detection logic
    ├── run_drone.py               # Combined object detection + MAVLink script
    └── [Link]                     # YOLOv8 model file

6. Commands Executed

# Raspberry Pi setup and updates
sudo apt update && sudo apt upgrade -y

# Enable camera and UART
sudo raspi-config

# Install required Python libraries
sudo apt install python3 python3-pip python3-venv python3-opencv python3-picamera2 -y

# YOLOv8 environment setup
python3 -m venv yolov8_env
source yolov8_env/bin/activate
pip install ultralytics opencv-python pymavlink

# Pixhawk communication test via PyMAVLink
python3
>>> from pymavlink import mavutil
>>> master = mavutil.mavlink_connection('/dev/serial0', baud=57600)
>>> master.wait_heartbeat()

# Running the integrated drone script
python run_drone.py

# Running MAVProxy for telemetry (in another terminal)
cd ~/mavenv
[Link] --master=/dev/serial0 --baudrate 57600 --out=udp:[Link]:14550

7. Remaining Tasks (To Be Completed)

Technical Enhancements:

- Optimize the YOLO model using TensorRT or ONNX for faster inference (aim for <100 ms latency).
- Add MAVLink command logic (MAV_CMD_DO_SET_MODE) in the script to change flight mode upon detection.
- Export the YOLO model in .onnx format and validate performance on Raspberry Pi 5.

Testing and Evaluation:

- Design a structured test plan with defined metrics, such as:
  - Detection latency < 500 ms
  - Accuracy > 85%
  - Live inference FPS > 10
- Perform outdoor drone tests in a controlled environment.
- Log and analyze MAVLink telemetry with object detection timestamps for review.

Integration Tasks:

- Fully integrate YOLO object detection with drone control logic (via PyMAVLink).
- Deploy and test a custom-trained YOLOv8 model (optional but recommended).
- Ensure the camera feed is processed in real time, with bounding-box output guiding flight decisions.

Hardware & Safety:

- Secure power management and thermal control for the Raspberry Pi (heatsink/fan recommended).
- Finalize emergency handling logic (e.g., return-to-launch, hold mode, low-power shutdown).
8. UART/USB Note with YOLOv8n

Although YOLOv8n handles object detection, it does not control the drone directly. To
send movement or telemetry commands, we use the MAVLink protocol over UART or USB to
communicate with the Pixhawk: YOLOv8n processes the images, while PyMAVLink or MAVProxy
bridges the detections to drone control over the USB/UART link.
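Since the Pixhawk can appear either as a USB device (typically `/dev/ttyACM0` on Raspberry Pi OS) or via the UART alias `/dev/serial0`, a small helper can pick whichever link is present. This is a hypothetical sketch, not part of the project script; the connection call itself is the standard PyMAVLink pattern used earlier in this report.

```python
# Hypothetical helper for choosing the Pixhawk link: prefer USB
# (/dev/ttyACM0) when present, otherwise fall back to the UART
# alias /dev/serial0.

def pick_pixhawk_port(available_devices, prefer_usb=True):
    usb = "/dev/ttyACM0"
    uart = "/dev/serial0"
    candidates = [usb, uart] if prefer_usb else [uart, usb]
    for dev in candidates:
        if dev in available_devices:
            return dev
    return None


def connect_pixhawk(port, baud=57600):
    # Hardware-dependent: requires pymavlink and a live Pixhawk.
    from pymavlink import mavutil
    master = mavutil.mavlink_connection(port, baud=baud)
    master.wait_heartbeat()
    return master
```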

9. Testing the Project

pi@raspberrypi:~ $ cd project_drone
pi@raspberrypi:~/project_drone $ python3 run_drone.py

Concepts Learned in the Drone Object Detection & MAVLink Integration Project
This project helped me gain practical knowledge by integrating computer vision and flight
control systems. I worked with Raspberry Pi 5 and Pixhawk 2.4.8, using UART/USB
communication, Python scripting, and YOLOv8-based real-time object detection. Here is a
detailed explanation of the major concepts I learned during this project:

1. MAVLink Communication Protocol

MAVLink is a lightweight messaging protocol for communication between drones and ground
control systems. I learned how to connect a Raspberry Pi to a Pixhawk 2.4.8 flight
controller using UART as well as USB for telemetry. Using PyMAVLink and MAVProxy, I could
send commands, receive telemetry, and change flight modes. This helped me understand the
internal messaging system that powers autonomous drones.
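A flight-mode change, one of the commands mentioned above, can be sketched as follows. The helper names are assumptions; `mode_mapping()` and `set_mode()` are real PyMAVLink calls, and the (param1, param2) convention follows the MAV_CMD_DO_SET_MODE definition, where param1 carries the base-mode flags and param2 the autopilot-specific custom mode.

```python
# Sketch of a flight-mode change over MAVLink.

MAV_MODE_FLAG_CUSTOM_MODE_ENABLED = 1  # bit 0 of the MAVLink base-mode field


def do_set_mode_params(custom_mode: int):
    """Return (param1, param2) for a MAV_CMD_DO_SET_MODE command."""
    return (MAV_MODE_FLAG_CUSTOM_MODE_ENABLED, custom_mode)


def change_mode(master, mode_name: str) -> None:
    # Hardware-dependent: 'master' is a live pymavlink connection.
    # mode_mapping() translates names like "GUIDED" to custom-mode ids.
    mode_id = master.mode_mapping()[mode_name]
    master.set_mode(mode_id)
```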

2. Headless Raspberry Pi Setup & UART Configuration

I configured the Raspberry Pi in headless mode (without a monitor) to control it remotely.
I enabled and tested UART communication through GPIO14 and GPIO15 and learned the role of
serial devices in Linux. I also learned how to resolve port conflicts by disabling
interfering services such as ModemManager.

3. Real-Time Object Detection using YOLOv8

YOLOv8 Nano was used for real-time object detection on the Raspberry Pi. I learned how to
install and use the 'ultralytics' library, configure the camera, and run detection using
OpenCV. The model could detect objects and draw bounding boxes on the live video feed,
which was useful for decision-making in drone movement.
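A minimal version of this detection loop is sketched below. The confidence-filter helper and the tuple layout are illustrative assumptions; the ultralytics result attributes (`boxes`, `names`, `cls`, `conf`, `xyxy`) are the library's standard API.

```python
# Minimal detection loop in the spirit of the project script; the
# confidence filter is a plain helper so it can be tuned independently.

def filter_detections(detections, min_conf=0.5):
    """Keep (label, confidence, box) tuples at or above min_conf."""
    return [d for d in detections if d[1] >= min_conf]


def run_detection():
    # Requires ultralytics, OpenCV, and a camera on /dev/video0.
    import cv2
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        res = model(frame)[0]
        dets = [(res.names[int(b.cls)], float(b.conf), b.xyxy[0].tolist())
                for b in res.boxes]
        for label, conf, box in filter_detections(dets):
            print(label, round(conf, 2), box)
```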

4. Sensor Fusion: Integrating Vision with Flight Logic

I combined object detection with flight logic in the 'run_drone.py' script. Based on
detected objects, commands were sent to the Pixhawk via MAVLink to control the drone. This
showed me how two independent systems (vision and flight control) can be fused to make
intelligent decisions.
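One common way to express such vision-driven commands is a GUIDED-mode velocity setpoint (the MAVLink SET_POSITION_TARGET_LOCAL_NED message). The sketch below is an illustration of that pattern, not the project's actual control code; the type_mask is built from the message's standard "ignore" bits so that only the velocity fields are honoured.

```python
# Sketch of steering the drone via a velocity setpoint
# (SET_POSITION_TARGET_LOCAL_NED). The type_mask is built from the
# MAVLink "ignore" bits so only the velocity fields are honoured.

IGNORE_PX, IGNORE_PY, IGNORE_PZ = 1, 2, 4
IGNORE_AX, IGNORE_AY, IGNORE_AZ = 64, 128, 256
IGNORE_FORCE, IGNORE_YAW, IGNORE_YAW_RATE = 512, 1024, 2048


def velocity_only_mask() -> int:
    """Ignore everything except the velocity fields."""
    return (IGNORE_PX | IGNORE_PY | IGNORE_PZ
            | IGNORE_AX | IGNORE_AY | IGNORE_AZ
            | IGNORE_FORCE | IGNORE_YAW | IGNORE_YAW_RATE)


def send_velocity(master, vx, vy, vz):
    # Hardware-dependent: 'master' is a live pymavlink connection.
    master.mav.set_position_target_local_ned_send(
        0,                      # time_boot_ms
        master.target_system,
        master.target_component,
        1,                      # MAV_FRAME_LOCAL_NED
        velocity_only_mask(),
        0, 0, 0,                # position (ignored)
        vx, vy, vz,             # velocity in m/s (NED frame)
        0, 0, 0,                # acceleration (ignored)
        0, 0)                   # yaw, yaw_rate (ignored)
```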

5. Virtual Environments and Dependency Management

I created and managed separate Python virtual environments for object detection and
MAVLink. This kept dependencies isolated and avoided version conflicts. I learned how to
activate environments, install libraries, and structure directories for a better
development workflow.

6. Error Handling and Debugging

Throughout the project, I encountered and resolved several errors, such as 'waiting for
heartbeat', port permission issues, 'PyQt platform plugin' warnings, and camera detection
failures. I learned how to read logs, check system devices such as '/dev/ttyAMA0', and
use tools like MAVProxy and VNC for debugging.

7. End-to-End System Integration

Most importantly, I learned how to build a fully functional, real-time drone vision system
from scratch. This included hardware setup, OS configuration, camera interfacing, script
development, and integration testing. It was a great experience in real-world embedded
system development and AI application on edge devices.
