GRAPH ALGORITHM:
JUMP POINT SEARCH
ALGORITHM
I. Introduction
II. Jump Point Search Algorithm
III. Pseudocode of the Algorithm
IV. Step-by-step Example
V. Sample Code for the Algorithm
VI. Advantages and Disadvantages
VII. Conclusion
VIII. References
Design and Analysis of Algorithms (CS243-M)
Prof. Jan Eilbert L. Lee
Eric Jerard A. Collantes
BSCS – 2A
I. Introduction
The Jump Point Search algorithm, introduced by Daniel Harabor and Alban
Grastien, is an online symmetry-breaking technique that accelerates pathfinding on
uniform-cost grid maps by "jumping over" large numbers of nodes that would normally
require explicit consideration (Harabor, 2011). By jumping over multiple nodes in each
step and exploiting the grid's regularity, the algorithm outperforms the A* search
algorithm and effectively skips redundant searches (Hofkamp, 2015). In addition, the
algorithm focuses only on the critical nodes that influence the path. Since the majority of
grid nodes are too unimportant to be kept in an open or closed list, the algorithm updates
those lists considerably less often. Although it may search a wider area, the authors
assert that because the open and closed lists are updated so much less frequently, the
overall gain is still significant (Hofkamp, 2015).
Unlike many similar algorithms, Jump Point Search requires no preprocessing,
incurs no memory overhead, and can be readily combined with most existing speedup
techniques, such as memory heuristics and abstraction (Harabor, 2011).
II. Jump Point Search Algorithm
Jump Point Search works by identifying "jump points": nodes where the path's
direction changes or where an obstacle forces a different direction to be considered.
According to Harabor (2011), the algorithm can be explained by two straightforward
pruning rules applied recursively during the search: one rule pertains to straight steps,
and the other to diagonal steps. The central concept behind both is to eliminate the
immediate neighbors around a node by demonstrating that an optimal path, whether
symmetric or not, can be traced from the current node's parent to each such neighbor
without passing through the current node itself. Two ideas are crucial to this algorithm:
Neighbor Pruning and Forced Neighbors.
Neighbor Pruning (Straight/Diagonal Pruning Rule)
[Figure: node x with its eight numbered neighbors, shown once for a straight move and
once for a diagonal move from the parent; pruned neighbors are grey.]
Coming from a parent node, node x is in the process of expansion. The line shows the
travel direction from its parent node (green-colored), which can be either straight or
diagonal. In both scenarios, we can promptly eliminate all grey neighbors, since they are
optimally reachable from x's parent without going through node x itself (Harabor, 2011).
Forced Neighbors
[Figure: node x with its eight numbered neighbors, again for a straight and a diagonal
move, but with an obstacle adjacent to x; forced neighbors are orange.]
Coming from a parent node, node x is in the process of expansion. The line shows the
travel direction from its parent node (green-colored), either straight or diagonal. Notice
that there is an obstacle adjacent to x. Because of this, the highlighted (orange-colored)
neighbors of x cannot be pruned, since the alternative optimal paths from the parent
node to those nodes are blocked (Harabor, 2011).
Rather than generating every natural and forced neighbor, the approach is to
recursively prune the neighbors surrounding each node. The goal is to remove
symmetries by recursively "jumping over" any nodes that can be optimally reached by a
path that does not go through the current node. The recursion halts upon encountering
an obstacle or identifying a jump-point successor (Harabor, 2011). Jump points are
nodes with at least one neighbor that cannot be reached by an alternative symmetric
path, so that any optimal path must pass through the current node.
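To make the pruning rules concrete, the following is a minimal Python sketch of
neighbor pruning on an 8-connected grid. The function name, the (dx, dy) direction
convention, and the is_free callback are illustrative assumptions, and the forced-neighbor
checks for diagonal travel are omitted for brevity.

```python
def natural_neighbours(x, direction, is_free):
    """Neighbours of x that survive pruning, given travel direction (dx, dy).

    is_free(pos) -> bool reports whether a cell is walkable.
    Illustrative helper only; not part of the published pseudocode.
    """
    dx, dy = direction
    px, py = x
    result = []
    if dx != 0 and dy != 0:                      # diagonal travel
        if is_free((px + dx, py)):
            result.append((px + dx, py))         # horizontal component
        if is_free((px, py + dy)):
            result.append((px, py + dy))         # vertical component
        if is_free((px + dx, py + dy)):
            result.append((px + dx, py + dy))    # continue diagonally
    else:                                        # straight travel
        if is_free((px + dx, py + dy)):
            result.append((px + dx, py + dy))    # continue straight
        # forced neighbours: an obstacle beside x blocks the symmetric path
        if dx != 0:                              # horizontal travel
            for side in (1, -1):
                if not is_free((px, py + side)) and is_free((px + dx, py + side)):
                    result.append((px + dx, py + side))
        else:                                    # vertical travel
            for side in (1, -1):
                if not is_free((px + side, py)) and is_free((px + side, py + dy)):
                    result.append((px + side, py + dy))
    return result
```

On an open grid, a straight move keeps only the neighbor directly ahead; an
adjacent obstacle adds the corresponding forced neighbor back in.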
III. Pseudocode of the Algorithm
3.1 Identify Successors (Harabor & Grastien, 2011, p. 3)
Input: current node x
Input: start node s
Input: goal node g
Output: Set of successors
successors(x) = []
neighbors(x) = prune(x, neighbors(x))
for all n ∈ neighbors(x) do
n = jump(x, direction(x, n), s, g)
add n to successors(x)
end for
return successors(x)
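This pseudocode translates almost line-for-line into Python. In the sketch below,
neighbours, prune, direction, and jump are passed in as functions, since their concrete
implementations depend on the grid representation; this parameterization is an
illustrative assumption, and the start node s is omitted because it is only threaded
through to jump.

```python
def identify_successors(x, goal, neighbours, prune, direction, jump):
    """Collect the jump-point successors of node x.

    neighbours, prune, direction, and jump mirror the operators in the
    pseudocode above and are supplied by the caller.
    """
    successors = []
    for n in prune(x, neighbours(x)):
        jp = jump(x, direction(x, n), goal)
        if jp is not None:           # discard directions that dead-end
            successors.append(jp)
    return successors
```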
3.2 Jump Function (Harabor & Grastien, 2011, p. 3)
Input: initial node x
Input: direction d
Input: start node s
Input: goal node g
Output: Next node in the path that is either a jump point or null
n = step(x, d)
if n is an obstacle or the edge of the grid then
return null
end if
if n == g then
return n
end if
if ∃ n′ ∈ neighbors(n) such that n′ is forced then
return n
end if
if d is diagonal then
for all i ∈ {1, 2} do
if jump(n, di, s, g) is not null then
return n
end if
end for
end if
return jump(n, d, s, g)
Here d1 and d2 denote the two straight directions that make up the diagonal d.
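A direct Python translation of the jump function might look as follows. The grid
encoding (0 = free, 1 = obstacle), the (row, col) tuples, and the walkable helper are
assumptions; the forced-neighbor tests follow the common straight/diagonal formulation.

```python
def walkable(grid, r, c):
    # Inside the grid and not an obstacle (0 = free, 1 = blocked)
    return 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == 0

def jump(grid, x, d, goal):
    """Jump from x in direction d = (dr, dc); return a jump point or None."""
    dr, dc = d
    r, c = x[0] + dr, x[1] + dc          # n = step(x, d)
    if not walkable(grid, r, c):         # obstacle or edge of the grid
        return None
    if (r, c) == goal:                   # the goal always stops the jump
        return (r, c)
    if dr != 0 and dc != 0:              # diagonal move
        # forced neighbour: a blocked cell beside the move opens a detour
        if (walkable(grid, r - dr, c + dc) and not walkable(grid, r - dr, c)) or \
           (walkable(grid, r + dr, c - dc) and not walkable(grid, r, c - dc)):
            return (r, c)
        # recurse along the two straight components of d
        if jump(grid, (r, c), (dr, 0), goal) is not None or \
           jump(grid, (r, c), (0, dc), goal) is not None:
            return (r, c)
    elif dc != 0:                        # horizontal move
        if (walkable(grid, r + 1, c + dc) and not walkable(grid, r + 1, c)) or \
           (walkable(grid, r - 1, c + dc) and not walkable(grid, r - 1, c)):
            return (r, c)
    else:                                # vertical move
        if (walkable(grid, r + dr, c + 1) and not walkable(grid, r, c + 1)) or \
           (walkable(grid, r + dr, c - 1) and not walkable(grid, r, c - 1)):
            return (r, c)
    return jump(grid, (r, c), d, goal)   # keep travelling in direction d
```

On an obstacle-free grid a straight jump simply runs to the goal or off the edge,
while an obstacle beside the scan line produces a jump point immediately past it.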
3.3 Compute Diagonal First Path (Harabor & Grastien, 2011, p. 4)
Input: x, an optimal-length path
Output: An optimal-length, diagonal-first path
1. Select an adjacent pair of edges along x where (nk−1, nk) is a straight move and
(nk, nk+1) is a diagonal move.
2. Replace (nk−1, nk) and (nk, nk+1) with two new edges: (nk−1, n′k), which is a diagonal
move, and (n′k, nk+1), which is a straight move. The operation is successful if (nk−1, n′k)
and (n′k, nk+1) are both valid moves; i.e. node n′k is not an obstacle.
3. Repeat steps 1 and 2, selecting and replacing adjacent edges, until no further changes
can be made to x.
4. Return x.
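The rewriting procedure in 3.3 can be sketched in Python as follows. The function
name, the coordinate tuples, and the is_free callback are illustrative assumptions.

```python
def diagonal_first(path, is_free):
    """Rewrite a grid path so each diagonal move precedes its straight partner.

    path is a list of (row, col) nodes; is_free(pos) -> bool reports whether
    the candidate corner cell is walkable. Illustrative sketch only.
    """
    path = list(path)
    changed = True
    while changed:
        changed = False
        for k in range(1, len(path) - 1):
            a, b, c = path[k - 1], path[k], path[k + 1]
            m1 = (b[0] - a[0], b[1] - a[1])          # move into n_k
            m2 = (c[0] - b[0], c[1] - b[1])          # move out of n_k
            straight_then_diag = ((m1[0] == 0) != (m1[1] == 0)) and \
                                 (m2[0] != 0 and m2[1] != 0)
            if straight_then_diag:
                nk = (a[0] + m2[0], a[1] + m2[1])    # take the diagonal first
                if is_free(nk):                      # valid only if not blocked
                    path[k] = nk
                    changed = True
    return path
```

A path that steps right then down-right is rewritten to step down-right first; if the
swapped corner cell is an obstacle, the path is left unchanged.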
IV. Step-by-step Example
Using this following uniform-cost grid, find the shortest path from the starting (green-
colored) node to the goal (red-colored) node (Witmer, 2013). The grey-colored nodes are
the obstacles.
Step 1: Expand horizontally and vertically. These
expansions end up hitting the edge of the grid,
so we move on to expanding diagonally.
Step 2: Expand diagonally, then repeat step 1 after
each diagonal step. Continue until a horizontal or
vertical expansion encounters a jump point rather than
merely hitting an obstacle or the edge of the grid.
Step 3: After expanding diagonally, we encounter a
node that has a forced neighbor, making it a jump
point. We add this node to the open set, then expand
this best open node vertically and horizontally.
Step 4: Expanding vertically, we hit the edge of the
grid. Horizontally, we encounter another node that has
a forced neighbor (jump point). We add this new node
to the open set. We proceed on the horizontal
expansion.
Step 5: We then visit the next best open node,
expanding diagonally. In doing so we hit the edge of
the grid, so we visit the next best open node and
expand diagonally again, where we also hit the edge of
the grid.
Step 6: We proceed on the only best open node, which
is by expanding diagonally. Then, we expand vertically
and horizontally again, where we hit the edge on both
expansions.
Step 7: This time, we visit the only best open node by
expanding diagonally. We then encounter the goal
node, which is treated with the same importance as
finding a jump point. Therefore, we add this node to
the open set. Then, we expand this last open node,
reaching the goal.
By adding the goal itself to the open set and
recognizing it as the goal when it is expanded, we
have solved for the best path.
V. Sample Code for the Algorithm
5.1 Jump Point Search Algorithm in Python
import heapq

def heuristic(a, b):
    # Manhattan distance between two cells
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def get_neighbours(node, grid):
    # All walkable cells among the eight surrounding cells
    dirs = [(0, 1), (0, -1), (1, 0), (-1, 0), (1, 1), (-1, -1), (1, -1), (-1, 1)]
    result = []
    for d in dirs:
        nxt = (node[0] + d[0], node[1] + d[1])
        if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                and grid[nxt[0]][nxt[1]] == 0):
            result.append(nxt)
    return result

def jump_point_search(start, goal, grid):
    # Note: this simplified sample performs a best-first (A*-style) search
    # over all eight neighbours; a full Jump Point Search would replace
    # get_neighbours with the pruned jump-point successors described above.
    open_list = []
    closed_list = set()
    g = {start: 0}
    parents = {start: None}
    heapq.heappush(open_list, (heuristic(start, goal), start))

    while open_list:
        _, current = heapq.heappop(open_list)

        if current == goal:
            # Reconstruct the path by walking back through the parent links
            path = []
            while current is not None:
                path.append(current)
                current = parents[current]
            path.reverse()
            return path

        if current in closed_list:
            continue
        closed_list.add(current)

        for neighbour in get_neighbours(current, grid):
            if neighbour in closed_list:
                continue
            # Step cost: 1 for straight moves, 2 for diagonal moves,
            # consistent with the Manhattan heuristic above
            tentative_g = g[current] + heuristic(current, neighbour)
            if tentative_g < g.get(neighbour, float("inf")):
                parents[neighbour] = current
                g[neighbour] = tentative_g
                f_score = tentative_g + heuristic(neighbour, goal)
                heapq.heappush(open_list, (f_score, neighbour))
    return []

grid = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, 0, 0]
]
start = (4, 0)
goal = (4, 6)
path = jump_point_search(start, goal, grid)
print(path)
for row in grid:
    print(f"{row}")
print()
# Mark the cells along the returned path with 2
for step in path:
    grid[step[0]][step[1]] = 2
for row in grid:
    print(f"{row}")
VI. Advantages and Disadvantages
Advantages of Jump Point Search
• Efficiency: The Jump Point Search algorithm reduces the number of nodes expanded,
speeding up the pathfinding process.
• Simplified Path Extraction: The resulting path often requires fewer waypoints, making
post-processing (e.g., smoothing) simpler.
• Directional Jumping: Efficiently handles large open areas by jumping over multiple
nodes in one step.
• Reduced Memory Usage: Fewer nodes are processed and stored compared to A*,
leading to lower memory consumption and overhead.
• Integration with Heuristics: Can be easily combined with different heuristic functions to
further enhance pathfinding efficiency.
Disadvantages of Jump Point Search
• Complexity: The algorithm is more complex to implement compared to basic A*.
• Grid Dependency: It is specifically designed for uniform-cost grids and may not
perform as well in non-grid environments.
• Maintenance: More difficult to debug and maintain due to the added complexity of
jump logic and forced neighbor checks.
• Diagonal Movement Handling: Handling diagonal movements and forced neighbors
can be particularly complex, requiring careful implementation.
• Real-Time Constraints: In real-time applications with dynamically changing grids, the
algorithm might need to frequently recompute paths, reducing its efficiency.
VII. Conclusion
Jump Point Search is an efficient pathfinding algorithm that significantly
optimizes the performance of the standard A* algorithm by reducing the number of nodes
expanded, particularly in uniform-cost grid environments. It excels in scenarios where
large open areas allow for substantial jumps, resulting in faster pathfinding and reduced
memory usage. However, its complexity and dependency on static or uniform grid-based
structures make it less suitable for non-grid, highly irregular, or dynamic environments.
Despite these limitations, this algorithm still maintains the optimality of the paths
found and offers significant practical benefits in applications such as games, robotics,
and geographic information systems (GIS). The algorithm's ability to integrate with
various heuristics and potential for parallel processing further improve its appeal. While
implementation and maintenance may pose challenges, the advantages often outweigh
these difficulties, making Jump Point Search a valuable tool for efficient and effective
pathfinding.
VIII. References
Harabor, D. (2011, September 14). Jump Point Search. Shortest Path. Retrieved May
29, 2024, from https://harablog.wordpress.com/2011/09/07/jump-point-search/
Harabor, D., & Grastien, A. (2011). Online Graph Pruning for Pathfinding on Grid Maps.
Proceedings of the AAAI Conference on Artificial Intelligence.
https://doi.org/10.1609/aaai.v25i1.7994
Hofkamp, A. (2015, October 31). Jump point search: Fast A* pathfinding for uniform cost
grids. GameDev.net. https://www.gamedev.net/tutorials/programming/artificial-intelligence/jump-point-search-fast-a-pathfinding-for-uniform-cost-grids-r4220/
Witmer, N. (2013, May 1). A visual explanation of jump point search. Retrieved May 30,
2024, from https://zerowidth.com/2013/a-visual-explanation-of-jump-point-search/