VIJAYANAGARA SRI KRISHNADEVARAYA UNIVERSITY
“JnanaSagara” Campus, Vinayaka Nagar, Cantonment, Ballari - 583105
DEPARTMENT OF STUDIES IN COMPUTER SCIENCE
Chairman 11.02.2025
Title: Gesture Control Robot
Project Summary:
A gesture control robot is a robotic system that can be controlled using hand or body
gestures. This technology uses sensors and algorithms to detect and interpret human gestures,
allowing users to interact with the robot in a more natural and intuitive way.
Objectives:
1. Develop Advanced Gesture Recognition Algorithms:
Create algorithms that can accurately and reliably recognize and interpret human
gestures.
2. Implement Real-Time Gesture Control:
Develop a system that can control the robot in real time using gestures (a minimal transmitter-side sketch is shown after this list).
3. Integrate Sensor Systems:
Integrate sensor systems, such as cameras and accelerometers, to detect and track
human gestures.
4. Create a User-Friendly Interface:
Create a user-friendly interface that allows users to easily interact with the robot
using gesture control.
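A minimal transmitter-side sketch for objectives 1–3, written for the Arduino IDE, is given below. It assumes an analog accelerometer (e.g., ADXL335) on pins A0/A1 and the four HT12E data inputs wired to digital pins 2–5; the pin numbers and tilt thresholds are illustrative assumptions and must be calibrated on the actual hardware.

// Transmitter-side sketch (illustrative assumptions: ADXL335-style analog
// accelerometer on A0/A1, HT12E data inputs D0..D3 on digital pins 2–5,
// tilt thresholds to be calibrated on the real hardware).
const int X_PIN = A0;                  // accelerometer X axis
const int Y_PIN = A1;                  // accelerometer Y axis
const int dataPins[4] = {2, 3, 4, 5};  // to HT12E D0..D3

// Drive the HT12E data lines with a 4-bit command word.
void sendCommand(byte cmd) {
  for (int i = 0; i < 4; i++) {
    digitalWrite(dataPins[i], (cmd >> i) & 1);
  }
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(dataPins[i], OUTPUT);
  Serial.begin(9600);                  // useful while tuning thresholds
}

void loop() {
  int x = analogRead(X_PIN);           // roughly mid-scale (~512) when the hand is level
  int y = analogRead(Y_PIN);

  // Threshold-based gesture classification: the tilt of the hand selects
  // one of five commands (0b0001 forward, 0b0010 backward, 0b0100 left,
  // 0b1000 right, 0b0000 stop).
  byte cmd = 0b0000;
  if      (x > 612) cmd = 0b0001;      // tilt forward
  else if (x < 412) cmd = 0b0010;      // tilt backward
  else if (y > 612) cmd = 0b0100;      // tilt left
  else if (y < 412) cmd = 0b1000;      // tilt right
  sendCommand(cmd);

  Serial.print(x); Serial.print('\t'); Serial.println(y);
  delay(50);                           // about 20 updates per second
}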
Key Components & Budget Breakdown:
Key Components Price
Arduino Lilypad ₹370
Accelerometer ₹380
RF 433 module ₹400
HT12E and HT12D ₹300
Motor driver L293DNE ₹280
BO motor and Wheels ₹400
Prototyping board ₹350
Battery ₹500
Total ₹3000/- Approx.
Methodology & Timeline (15 Days):
Days 1–3: Research and Planning:
• Research existing gesture-control technologies and define project goals and objectives.
• Brainstorm ideas and create a rough sketch of the robot’s design.
• Research sensors and technologies for gesture recognition and create a
preliminary list of materials and components.
Days 4–6: Design Refinement and Sensor Selection:
• Refine the robot’s design using CAD software. Research and select sensors for gesture
recognition.
• Create a detailed list of materials and components. Design the gesture
recognition system and select machine learning algorithms.
• Create a preliminary plan for the robot’s software.
Days 7–9: Software Planning and Microcontroller Selection:
• Refine the robot’s software plan and select programming languages and libraries.
• Research and select a microcontroller for the robot. Create a detailed plan for the robot’s electronics and circuit design (see the receiver-side sketch after this list).
• Design the robot’s user interface and select software tools. Research and select a power supply for the robot.
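The receiver-side mapping from the received 4-bit command to the L293D inputs is sketched below. This is an assumption made for illustration: the parts list includes only one Lilypad, and in the simplest wiring the HT12D data outputs drive the L293D inputs directly with no code at all; the sketch applies only if an Arduino-compatible board is also placed on the robot.

// Receiver-side sketch (assumption: an Arduino-compatible board reads the
// HT12D outputs; in the simplest build the decoder drives the L293D
// directly and this code is not needed). Pin numbers are illustrative.
const int rxPins[4] = {2, 3, 4, 5};    // from HT12D D0..D3
const int IN1 = 6, IN2 = 7;            // L293D inputs, left motor
const int IN3 = 8, IN4 = 9;            // L293D inputs, right motor

// Set both BO motors from a direction code:
// 'F' forward, 'B' backward, 'L' left, 'R' right, 'S' stop.
void drive(char dir) {
  int l1 = LOW, l2 = LOW, r1 = LOW, r2 = LOW;
  switch (dir) {
    case 'F': l1 = HIGH; r1 = HIGH; break;   // both motors forward
    case 'B': l2 = HIGH; r2 = HIGH; break;   // both motors backward
    case 'L': l2 = HIGH; r1 = HIGH; break;   // spin left
    case 'R': l1 = HIGH; r2 = HIGH; break;   // spin right
    default:  break;                         // stop: all inputs LOW
  }
  digitalWrite(IN1, l1); digitalWrite(IN2, l2);
  digitalWrite(IN3, r1); digitalWrite(IN4, r2);
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(rxPins[i], INPUT);
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
}

void loop() {
  // Rebuild the 4-bit command sent by the transmitter.
  byte cmd = 0;
  for (int i = 0; i < 4; i++) {
    if (digitalRead(rxPins[i])) cmd |= (1 << i);
  }
  if      (cmd == 0b0001) drive('F');
  else if (cmd == 0b0010) drive('B');
  else if (cmd == 0b0100) drive('L');
  else if (cmd == 0b1000) drive('R');
  else                    drive('S');
  delay(20);
}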
Days 10–12: Detailed Design and Prototyping:
• Create a detailed design of the robot’s mechanical and electrical components.
• Develop a prototype of the robot’s gesture recognition system.
• Test and refine the gesture recognition system. Develop a prototype of the robot’s electronics and circuit design.
• Test and refine the electronics and circuit design (a simple bench-test sketch follows this list).
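A simple bench-test routine for the prototype’s drive electronics is given below, reusing the illustrative L293D pin assignments from the receiver sketch above. It steps through each drive command for one second so wiring and motor directions can be verified before the RF link is integrated.

// Bench-test routine for the L293D and BO motors (illustrative pins).
const int IN1 = 6, IN2 = 7;   // L293D inputs, left motor
const int IN3 = 8, IN4 = 9;   // L293D inputs, right motor

void setMotors(int a, int b, int c, int d) {
  digitalWrite(IN1, a); digitalWrite(IN2, b);
  digitalWrite(IN3, c); digitalWrite(IN4, d);
}

void setup() {
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  Serial.println("forward");  setMotors(HIGH, LOW, HIGH, LOW);  delay(1000);
  Serial.println("backward"); setMotors(LOW, HIGH, LOW, HIGH);  delay(1000);
  Serial.println("left");     setMotors(LOW, HIGH, HIGH, LOW);  delay(1000);
  Serial.println("right");    setMotors(HIGH, LOW, LOW, HIGH);  delay(1000);
  Serial.println("stop");     setMotors(LOW, LOW, LOW, LOW);    delay(2000);
}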
Days 13–15: Testing and Debugging:
• Integrate the gesture recognition system and electronics into the robot. Test and debug the robot’s functionality.
• Refine the robot’s performance and accuracy.
• Develop a user manual and documentation for the robot.
• Prepare for final testing and demonstration.
Expected Outcomes:
• Improved Gesture Recognition Accuracy:
The robot can accurately recognize and interpret human gestures, allowing seamless control.
• Advanced Sensor Fusion:
The robot can combine data from multiple sensors to improve gesture recognition accuracy and robustness (a brief illustrative sketch follows).
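As an illustration of the sensor-fusion idea, the fragment below shows a generic complementary filter that blends an accelerometer-derived tilt angle with an integrated gyroscope rate. A gyroscope is not part of the parts list above; it is assumed here purely to show how a fused estimate can make gesture detection steadier.

// Generic complementary-filter fusion (illustrative only; the gyroscope
// is an assumed extra sensor, not one of the budgeted components).
// accelAngleDeg : tilt angle computed from the accelerometer (noisy, no drift)
// gyroRateDps   : angular rate from a gyroscope (smooth, but drifts)
// dtSeconds     : time elapsed since the last update
float fusedAngleDeg = 0.0f;

float fuse(float accelAngleDeg, float gyroRateDps, float dtSeconds) {
  const float alpha = 0.98f;  // trust the gyro short-term, the accelerometer long-term
  fusedAngleDeg = alpha * (fusedAngleDeg + gyroRateDps * dtSeconds)
                + (1.0f - alpha) * accelAngleDeg;
  return fusedAngleDeg;
}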