15-451 Algorithms 10/07/09
* All-pairs shortest paths
- matrix method (recap from end of lecture)
- Floyd-Warshall (didn't get to in lecture)
==========================
ALL-PAIRS SHORTEST PATHS
------------------------
Say we want to compute the length of the shortest path between *every*
pair of vertices. If we run Bellman-Ford separately from each of the n
possible start vertices, this takes time O(mn^2). Here is a nice
algorithm that uses the matrix representation of a graph, and runs in
time O(n^3 log n).
ALL-PAIRS SHORTEST PATHS via MATRIX PRODUCTS
--------------------------------------------
Given graph G, define matrix A(G) as follows:
- A[i,i] = 0 for all i.
- if there is an edge from i to j, then A[i,j] = length of that edge.
- otherwise, A[i,j] = infinity.
I.e., A[i,j] = length of shortest path from i to j using 1 or fewer edges.
Now, following the basic Dynamic Programming idea, can we use this to
produce a new matrix B where B[i,j] = length of the shortest path from
i to j using 2 or fewer edges?
Answer: yes. B[i,j] = min_k (A[i,k] + A[k,j])
[think about why this is true]
I.e., what we want to do is compute a matrix product B = A*A except we
change "*" to "+" and we change "+" to "min".
In other words, instead of computing the sum of products, we compute
the min of sums. What if we now want to get shortest paths using 4 or
fewer edges? Just need to compute C = B*B. I.e., to get from i to j
using 4 or fewer edges, we need to go from i to some k using 2 or
fewer edges, and then from k to j using 2 or fewer edges.
Just need to keep squaring O(log n) times.
Running time: O(n^3 log n). A shortest path never needs more than n-1
edges (assuming no negative cycles), so O(log n) squarings suffice, and
each (min,+) matrix multiply takes O(n^3) operations.
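Here is how the repeated-squaring method might look in Python (a
minimal sketch; the `min_plus_product` and `all_pairs_matrix` names and
the use of `float('inf')` for "no edge" are just illustrative choices):

```python
INF = float('inf')  # stands in for "no edge"

def min_plus_product(A, B):
    """(min,+) matrix product: C[i][j] = min_k (A[i][k] + B[k][j])."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def all_pairs_matrix(A):
    """A[i][j] = edge length, INF if no edge, 0 on the diagonal.
    Square until the matrix covers paths of >= n-1 edges."""
    n = len(A)
    D = A
    edges = 1  # D currently covers paths using <= edges edges
    while edges < n - 1:
        D = min_plus_product(D, D)  # doubles the edge budget
        edges *= 2
    return D
```

For example, on the 3-vertex graph with edges 0->1 of length 3,
1->2 of length 4, and 2->0 of length 1, one squaring already yields
all shortest-path lengths (e.g. the 0->2 distance 7 via vertex 1).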
==============================================================================
All-pairs shortest paths via Floyd-Warshall.
Here is an algorithm that shaves off the O(log n) and runs in time
O(n^3). The idea is that instead of increasing the *number* of edges in
the path, we'll increase the set of vertices we allow as intermediate
nodes in the path. In other words, starting from the same base case
(the shortest path that uses no intermediate nodes), we'll then go on
to the shortest path that's allowed to use node 1 as an intermediate
node. Then the shortest path that's allowed to use {1,2} as
intermediate nodes, and so on.
Let A be the same adjacency matrix we used earlier
(A[i][j] = length of edge (i,j) if the edge exists, infinity if it
doesn't exist, and 0 on the diagonal).
/* after each iteration of the outside loop, A[i][j] = length of
shortest i->j path that's allowed to use intermediate vertices in
the set 1..k */
for k = 1 to n do:
for each i,j do:
A[i][j] = min( A[i][j], A[i][k] + A[k][j] );
i.e., you either go through k or you don't. Total time is O(n^3).
What's amazing here is how compact and simple the code is!
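The pseudocode above translates almost line-for-line into Python (a
minimal sketch; the function name `floyd_warshall` and the in-place
update are just one way to write it):

```python
INF = float('inf')  # stands in for "no edge"

def floyd_warshall(A):
    """A[i][j] = edge length, INF if no edge, 0 on the diagonal.
    Updates A in place; after iteration k of the outer loop, A[i][j]
    is the shortest i->j path using intermediate vertices in 0..k."""
    n = len(A)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # either the best i->j path goes through k, or it doesn't
                if A[i][k] + A[k][j] < A[i][j]:
                    A[i][j] = A[i][k] + A[k][j]
    return A
```

Running it on the same small graph as before gives the same distance
matrix as the squaring method, but with a single O(n^3) pass.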
===================================================================