
Computational models and methods for processing sensor information

07/06/2025

Biswarup Mukherjee
Assistant Professor
Centre for Biomedical Engineering, IIT Delhi

Executive Programme in Robotics (Batch 2)


Estimating position and orientation in a robot – Odometry
Differential drive robot
• Note: positions are relative.
• Absolute position calculation is additive.
• Error accumulates over time.
Particle filter
Imagine that a robot is in a room:
• The robot knows the general layout of the room.
• The robot does not know where it is in the room.
• The robot needs to determine its POSE in the room (POSE = position + orientation).
• Sensors include ENCODERS + LIDAR.

Figure: what the world is really like vs. how the robot knows the world (binary occupancy grid), with the initial location marked.
Particle filter
• Sensors include ENCODERS + LIDAR.

Figure: what the sensors are telling the robot (measurements + NOISE) vs. the possible locations.
Odometry for dead-reckoning
Orientation + relative movement:
• Estimate how much the robot has moved and in which direction.
• Errors add up over large distances.
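As a rough illustration (not part of the original slides), a minimal dead-reckoning update for a differential-drive robot might look like the Python sketch below. The wheel-base value, the encoder readings, and the midpoint-heading approximation are assumptions made for illustration.

```python
import math

def odometry_update(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckoning update for a differential-drive robot.

    d_left / d_right: distance travelled by each wheel since the last update
    (from the encoders); wheel_base: distance between the two wheels.
    """
    d_center = (d_left + d_right) / 2.0        # distance moved by the robot centre
    d_theta = (d_right - d_left) / wheel_base  # change in heading

    # Integrate assuming the heading changes halfway through the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Each update is added onto the previous pose, so encoder noise accumulates.
pose = (0.0, 0.0, 0.0)
for d_l, d_r in [(0.10, 0.10), (0.09, 0.11), (0.10, 0.12)]:  # made-up encoder readings
    pose = odometry_update(*pose, d_l, d_r, wheel_base=0.30)
print(pose)
```

Because each step is added onto the previous estimate, any encoder or slip error is carried forward, which is why odometry alone drifts over large distances.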
Particle filter
NOISY ODOMETRY + NOISY LIDAR
Imagine yourself in a dark room with a torch. You know the general layout of the room. How will you estimate your current position?

Figure: we could be anywhere → sense → probable vs. improbable locations on the map.

Move and use memory:
• Keep moving and measuring the movement.
• Keep scanning the room.
• The movement will result in a partial map being created.
• Partial map + movement can be used to estimate the current position.
Particle filter
Imagine yourself in a dark room with a torch. You know the general layout of the room. How will you estimate your current position?
• Seed multiple locations and orientations at random (the figure also shows the true pose for reference).
• Sense: eliminate wrong guesses and emphasize probable locations.
• Sense + Move (odometry + noise): all particles move in the same manner.
• Next iteration: more particles in the high-probability areas.
This is Monte-Carlo Localization.
NOTE: New seeds are in random orientation and location.
Particle filter
• Sense + Move (odometry + noise): all particles move in the same manner.
• Next iteration: more particles in the high-probability areas (Monte-Carlo Localization).
• NOTE: New seeds are in random orientation and location.
New high-probability zones → new particle seeding.
• Now we are closer to the real position, as there are only two high-probability areas.
• Reduce the number of seeds (particles) in each step – Adaptive Monte Carlo Localization (AMCL).
• CAUTION: Do not underseed, or the position estimate will be error prone.
Particle filter – putting it all together
Step 1: Create uniform random seeds on the MAP
Step 2: Make a lidar measurement
Step 3: Eliminate the seeds that do not match the measurements
Step 4: Move a predetermined amount
Step 5: Reseed particles in the high-probability areas (a minimal sketch follows)
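A toy, one-dimensional version of these five steps is sketched below (not from the slides). The corridor, the wall at 10 m, the particle count, and the noise levels are all assumptions, and the loop runs the steps in the usual move → sense → weight → resample order, with Steps 3 and 5 folded into a single importance-weighting and resampling pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D corridor: the robot is somewhere on [0, 10] m and its
# "lidar" measures the distance to the wall at x = 10 m.
WALL, true_x = 10.0, 2.0

def measure(x, noise=0.1):
    return WALL - x + rng.normal(0.0, noise)

# Step 1: create uniform random seeds on the map.
particles = rng.uniform(0.0, WALL, size=500)

for _ in range(5):
    # Step 4: move a predetermined amount; every particle moves the same way,
    # plus a little odometry noise per particle.
    true_x += 0.5
    particles += 0.5 + rng.normal(0.0, 0.05, size=particles.size)
    # Step 2: make a (noisy) lidar measurement.
    z = measure(true_x)
    # Step 3: particles whose expected measurement disagrees get a low weight.
    weights = np.exp(-0.5 * ((z - (WALL - particles)) / 0.1) ** 2)
    weights /= weights.sum()
    # Step 5: reseed (resample) particles in the high-probability areas.
    particles = rng.choice(particles, size=particles.size, p=weights)

print("estimated x:", particles.mean(), "true x:", true_x)
```

After a few iterations the surviving particles cluster around the true position, which is exactly the "two high-probability areas shrinking to one" behaviour described above.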

In Step 4, how much will the robot move?


How will the robot generate the map?
Simultaneous Localization and Mapping
SLAM (simultaneous localization and mapping) is a method used for mobile robots:
• It builds a map and localizes the robot in that map at the same time.
• SLAM algorithms allow the vehicle to map out unknown environments.
• The map information can be used to carry out tasks such as path planning and obstacle avoidance.

Without SLAM:
• The robot will just move randomly within a room.
• It may not be able to clean the entire floor surface.
• In addition, this approach uses excessive power, so the battery will run out more quickly.
Simultaneous Localization and Mapping
SLAM (simultaneous localization and mapping) is a method used for mobile robots:
• It builds a map and localizes the robot in that map at the same time.
• SLAM algorithms allow the vehicle to map out unknown environments.
• The map information can be used to carry out tasks such as path planning and obstacle avoidance.

With SLAM:
• Information such as the number of wheel revolutions + data from cameras and other imaging sensors is used to determine the amount of movement needed. This is called localization.
• The robot can also simultaneously use the camera and other sensors to create a map of the obstacles in its surroundings and avoid cleaning the same area twice. This is called mapping.

Visual SLAM – camera + TOF sensor | LIDAR SLAM – LIDAR + odometry

https://www.cnet.com/home/kitchen-and-household/this-is-why-your-roombas-random-patterns-actually-make-perfect-sense/
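To make the "mapping" half concrete, a minimal occupancy-grid update for a single lidar beam might look like the assumed sketch below: cells along the beam gather free-space evidence, and the cell at the measured range gathers occupied evidence. The grid size, resolution, and log-odds increments are illustrative, and bounds checking is omitted for brevity.

```python
import numpy as np

def update_grid(grid, pose, angle, rng_m, cell=0.1):
    """Update occupancy log-odds along one lidar beam.

    grid: 2-D float array of log-odds; pose: (x, y) of the robot in metres;
    angle: beam direction in radians; rng_m: measured range in metres.
    """
    x0, y0 = pose
    n_steps = int(rng_m / cell)
    for i in range(n_steps):
        cx = int((x0 + i * cell * np.cos(angle)) / cell)
        cy = int((y0 + i * cell * np.sin(angle)) / cell)
        grid[cy, cx] -= 0.4          # evidence that this cell is free
    hx = int((x0 + rng_m * np.cos(angle)) / cell)
    hy = int((y0 + rng_m * np.sin(angle)) / cell)
    grid[hy, hx] += 0.9              # evidence that the hit cell is occupied

grid = np.zeros((100, 100))          # 10 m x 10 m map at 0.1 m resolution
update_grid(grid, pose=(5.0, 5.0), angle=0.0, rng_m=3.0)
```

Repeating this for every beam of every scan, at the pose estimated by localization, is what gradually turns raw ranges into the binary or probabilistic occupancy grids shown in the figures.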
Simultaneous Localization and Mapping
SLAM (simultaneous localization and mapping) is a two-stage process:
• It builds a map and localizes the robot in that map at the same time.
• SLAM algorithms allow the vehicle to map out unknown environments.
• The map information can be used to carry out tasks such as path planning and obstacle avoidance.

Visual SLAM – camera + TOF sensor | LIDAR SLAM – LIDAR + odometry

Pose graph estimation
A robot in a space, with NO MAP, using LIDAR + ODOMETRY (sense + move).
• Ideal case (no NOISE): the MAP after completion matches the room.
• Real case (LIDAR is accurate, ODOMETRY is NOISY): the real and measured maps are vastly different due to sensor noise.


Pose graph estimation – Eureka!
The path is denoted as a POSE GRAPH with nodes + edges (this looks familiar).
• Nodes and edges are elastic.
Connect the pose graph → merge the two measurements → optimize the graph.
LOOP CLOSURES are essential to update the graph and provide an overall improvement to the entire graph (a small optimization sketch follows).
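The sketch below is a minimal, translation-only illustration of this idea: node positions are pulled elastically so that every edge (odometry and loop closure) is satisfied as well as possible in a least-squares sense. Real pose graphs also optimize orientation and weight edges by their uncertainty; the node and edge values here are made up.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy translation-only pose graph. Nodes 0..3 are 2-D positions; each edge
# stores a measured relative displacement between two nodes.
edges = [
    (0, 1, np.array([1.0, 0.0])),    # odometry edges (slightly drifted)
    (1, 2, np.array([0.1, 1.0])),
    (2, 3, np.array([-1.2, 0.1])),
    (3, 0, np.array([0.0, -1.0])),   # loop-closure edge: revisiting the start area
]

def residuals(flat):
    nodes = flat.reshape(-1, 2)
    res = [nodes[j] - nodes[i] - z for i, j, z in edges]  # edge mismatch
    res.append(nodes[0])                                  # anchor node 0 at the origin
    return np.concatenate(res)

# Initial guess: chain the odometry edges, drift included.
init = np.zeros((4, 2))
for i, j, z in edges[:3]:
    init[j] = init[i] + z

sol = least_squares(residuals, init.ravel())
print(sol.x.reshape(-1, 2))          # optimized node positions
```

The loop-closure edge is what spreads the accumulated odometry error over the whole chain instead of leaving it all at the last node.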
Pose graph estimation – FINAL OUTPUT
Keep adding more loop closures; the pose graph and point cloud are updated.
Final output: probabilistic occupancy grid (MAP) + the path followed.
CAUTION: An incorrect loop closure can skew the entire map. An incorrect association constrains and merges nodes that should stay apart, so the map is skewed.
Path/Motion planning
How does the robot get from the start to the goal in the occupancy grid?
• Search-based path planning: A*
• Sampling-based path planning: RRT
The starting pose and GOAL pose are each a 2D coordinate (X, Y) + orientation.
Motion planning estimates the states at every point and their derivatives (velocity, acceleration).
Search-based path planning
Starting from the map, move in a random direction and add the cost.
But this may not be the best or optimal path.
Repeat with other paths: keep walking randomly till the graph produces a satisfactory path.
Walking randomly is not an efficient solution – can we add additional constraints?


Search-based path planning – Understanding A*
• Divide the space into a grid and assign costs to every move.
• For every point, calculate the straight-line distance to the goal.
• Choose the path that has the lowest total cost.
• Repeat with other paths: move in the general direction of the target till the goal is reached (see the sketch below).
(Shakey: the first general-purpose mobile robot.)
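A compact grid-based A* sketch of the idea above follows; the uniform move cost of 1 and the straight-line-distance heuristic are the assumptions being made here, and the small test grid is made up.

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected grid (0 = free cell, 1 = occupied cell).

    g = cost from the start, h = straight-line distance to the goal
    (the heuristic); the node with the lowest f = g + h is expanded next.
    """
    def h(cell):
        return ((cell[0] - goal[0]) ** 2 + (cell[1] - goal[1]) ** 2) ** 0.5

    tie = itertools.count()                       # breaks ties in the heap
    frontier = [(h(start), 0.0, next(tie), start, None)]
    came_from, best_g = {}, {start: 0.0}
    while frontier:
        _, g, _, cell, parent = heapq.heappop(frontier)
        if cell in came_from:                     # already expanded
            continue
        came_from[cell] = parent
        if cell == goal:                          # walk back to recover the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g + 1.0                      # each move costs 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), ng, next(tie), nxt, cell))
    return None                                   # no path exists

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, start=(0, 0), goal=(2, 0)))
```

The heuristic is what makes the search "move in the general direction of the target" instead of wandering randomly.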
Search-based path planning – Understanding A*
What are the disadvantages of A*?
• More than 2 dimensions.
• What if a finer grid is required or the map is large?
• The environment is dynamic.
SOLUTION: Sampling-based algorithms – Rapidly-exploring Random Tree (RRT*):
• Quick exploration of the space.
• The entire space is mapped (not just the goal).
• The optimal path to any point is available.
(A minimal RRT sketch follows.)
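Below is a minimal sketch of plain RRT (not the optimal RRT* variant named above). The workspace size, the single circular obstacle, the step size, and the sample budget are all assumptions, and the collision check is reduced to testing only the new node rather than the full connecting edge.

```python
import math
import random

random.seed(1)

# Hypothetical 10 m x 10 m space with one circular obstacle (values illustrative).
OBSTACLE, RADIUS = (5.0, 5.0), 1.5
START, GOAL, STEP = (1.0, 1.0), (9.0, 9.0), 0.5

def collision_free(p):
    # Only the new node is checked; a full RRT would also check the connecting edge.
    return math.dist(p, OBSTACLE) > RADIUS

nodes, parent = [START], {START: None}
for _ in range(5000):
    sample = (random.uniform(0, 10), random.uniform(0, 10))   # random free-space sample
    nearest = min(nodes, key=lambda n: math.dist(n, sample))  # closest existing node
    d = math.dist(nearest, sample)
    if d == 0.0:
        continue
    # Steer: take one fixed-size step from the nearest node toward the sample.
    new = (nearest[0] + STEP * (sample[0] - nearest[0]) / d,
           nearest[1] + STEP * (sample[1] - nearest[1]) / d)
    if not collision_free(new):
        continue
    nodes.append(new)
    parent[new] = nearest
    if math.dist(new, GOAL) < STEP:               # close enough: recover the path
        path, n = [], new
        while n is not None:
            path.append(n)
            n = parent[n]
        print("path found with", len(path), "nodes after growing", len(nodes), "nodes")
        break
```

Because the tree grows toward random samples rather than sweeping a fixed grid, it explores large or high-dimensional spaces quickly, which is exactly the advantage claimed over grid-based A*.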
Need for motion planning and extended object tracking
Motion planning can help in determining the overall path; the local environment is dynamic and requires localized path alteration.

Need for motion planning and extended object tracking
In most cases these dynamic objects in the environment are extended (larger than the wavelength).
We need to know:
• Position
• Kinematics
• Size, shape, orientation (extent)
Computing and storing the shape is challenging.
Extended object tracking
The problem of partitioning
How many objects are there? (bike, car, pedestrian)
Assign heuristic criteria for clustering:
1. Max distance between two points
2. Min points per group
3. A shape is attached
4. The centroid has the minimum distance to all points
(A minimal clustering sketch follows.)
How many objects are there?
• Ambiguity in clustering when objects are close.
• Incorrect clustering can lead to inefficient motion planning.
• SOLUTION: Maintain and track the clusters over time (next slide).
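A minimal Euclidean-clustering sketch using only criteria 1 and 2 above (max distance between points and min points per group); the distance threshold, the minimum cluster size, and the sample points are illustrative.

```python
import numpy as np

def euclidean_cluster(points, max_dist=0.5, min_points=3):
    """Group 2-D lidar points: points closer than max_dist are joined into the
    same cluster; clusters smaller than min_points are discarded as noise."""
    unassigned = set(range(len(points)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        cluster, frontier = [seed], [seed]
        while frontier:                      # grow the cluster from its frontier
            i = frontier.pop()
            near = [j for j in unassigned
                    if np.linalg.norm(points[i] - points[j]) < max_dist]
            for j in near:
                unassigned.discard(j)
            cluster += near
            frontier += near
        if len(cluster) >= min_points:
            clusters.append(points[cluster])
    return clusters

# Two well-separated groups of points plus one stray return.
pts = np.array([[0, 0], [0.2, 0.1], [0.1, 0.3],      # object 1
                [5, 5], [5.1, 5.2], [4.9, 5.1],      # object 2
                [9, 0]])                              # noise (dropped)
print([len(c) for c in euclidean_cluster(pts)])
```

When two objects approach each other, their points fall within the same max_dist neighbourhood and merge into one cluster, which is the ambiguity the slide warns about.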
Extended object tracking
1. Use the velocity and orientation detection – using radar/lidar.
2. Predict the next state of the object.
3. Predict the lidar measurements based on the next state.
4. Match observations with predictions and track the clusters.
(A toy tracking sketch follows.)
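A toy sketch of steps 1–4, using a constant-velocity prediction for each track and nearest-neighbour matching of cluster centroids to predictions; the track states, the time step, and the gating distance are assumptions for illustration (a full tracker would predict the extent and the expected lidar returns as well).

```python
import numpy as np

# Each track keeps a centroid position and a velocity (constant-velocity model).
tracks = [{"pos": np.array([0.0, 0.0]),  "vel": np.array([1.0, 0.0])},
          {"pos": np.array([10.0, 5.0]), "vel": np.array([0.0, -1.0])}]
DT, GATE = 0.1, 1.0          # time step (s) and max association distance (m)

def step(tracks, detections):
    # Steps 1-2: use the estimated velocity to predict each object's next state.
    for t in tracks:
        t["pred"] = t["pos"] + t["vel"] * DT
    # Steps 3-4: match each detection (a cluster centroid) to the nearest
    # prediction and update that track; unmatched tracks just keep coasting.
    for z in detections:
        dists = [np.linalg.norm(z - t["pred"]) for t in tracks]
        i = int(np.argmin(dists))
        if dists[i] < GATE:
            tracks[i]["vel"] = (z - tracks[i]["pos"]) / DT
            tracks[i]["pos"] = z
        # else: an unmatched detection could start a new track (omitted here).

# One frame of (made-up) cluster centroids, e.g. from the clustering step above.
step(tracks, detections=[np.array([0.12, 0.01]), np.array([10.0, 4.88])])
print([t["pos"] for t in tracks])
```

Keeping a predicted state per object is what resolves the clustering ambiguity over time: a detection is assigned to the track whose prediction it matches, rather than being re-partitioned from scratch every scan.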
Any Questions?

Contact Us:
[email protected]
EXECUTIVE PROGRAMME IN ROBOTICS
