OR 215 Spring 1998
Network Flows M. Hartmann
MINIMUM COST SPANNING TREE PROBLEM II
Prim’s Algorithm
Sensitivity Analysis
OPTIMALITY CONDITIONS
Theorem. (Cut optimality theorem) A spanning tree T* is a minimum spanning tree if and only if it satisfies the following cut optimality condition:

For every tree arc (i,j) ∈ T*, c_ij ≤ c_kl for every arc (k,l) contained in the cut formed by deleting arc (i,j) from T*.
Theorem. (Path optimality theorem) A spanning tree T* is a minimum spanning tree if and only if it satisfies the following path optimality condition:

For every nontree arc (k,l) of G, c_ij ≤ c_kl for every arc (i,j) contained in the path of T* connecting nodes k and l.
Recall: The path optimality condition leads to Kruskal’s greedy algorithm.
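As a concrete illustration, Kruskal's greedy algorithm can be sketched in Python. This is a sketch under assumptions (node labels 1..n, arcs given as (cost, i, j) triples); the union-find helper is a standard way to detect whether an arc would close a cycle, not part of the original notes.

```python
# Sketch of Kruskal's greedy algorithm: scan arcs in nondecreasing cost order,
# keeping each arc whose endpoints lie in different components.

def kruskal(n, arcs):
    """n = number of nodes (labeled 1..n); arcs = list of (cost, i, j)."""
    parent = list(range(n + 1))  # union-find forest over the nodes

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for cost, i, j in sorted(arcs):
        ri, rj = find(i), find(j)
        if ri != rj:              # (i,j) joins two components: keep it
            parent[ri] = rj
            tree.append((i, j, cost))
    return tree
```

With a sorted arc list and near-constant-time union-find operations, the scan runs in O(m log m) time overall, dominated by the sort.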
Theorem. Let F be a subset of arcs of some minimum cost spanning tree. Let P be the set of nodes of some component of F. Let (i,j) be a minimum cost arc in the cut (P,N\P). Then F+(i,j) is a subset of arcs of some minimum cost spanning tree.
Proof: Suppose that F is a subset of the minimum cost tree T*. If (i,j) ∈ T*, there is nothing to prove. So suppose that (i,j) ∉ T*. Adding (i,j) to T* creates a cycle C, and C has at least one arc (k,l) ≠ (i,j) in (P,N\P). (Why?) By assumption, c_ij ≤ c_kl. Also, T* satisfies the cut optimality conditions, so c_kl ≤ c_ij. It follows that c_ij = c_kl, and that
T*+(i,j)\(k,l) is also a minimum cost spanning tree.

How can we take advantage of this property?
PRIM’S ALGORITHM
Prim's algorithm grows a spanning subtree rooted at node 1, which is a subset of arcs of some minimum cost spanning tree.
To identify the arc (i,j), Prim’s algorithm maintains labels d(j) for each node j not yet added to the spanning tree:
d(j) = minimum cost of an arc (i,j) with i ∈ P
Prim’s algorithm also keeps track of which node i ∈ P has c_ij = d(j) by setting pred(j) := i.
algorithm PRIM;
begin
  P := {1}; T := N \ {1}; F := ∅;
  d(1) := 0 and pred(1) := 0;
  d(j) := c_1j and pred(j) := 1 for all (1,j) ∈ A,
    and d(j) := ∞ otherwise;
  while P ≠ N do
  begin
    { node selection, also called FINDMIN }
    let i ∈ T be a node with d(i) = min { d(j) : j ∈ T };
    P := P ∪ { i }; T := T \ { i }; add (i,pred(i)) to F;
    { cost update }
    for each (i,j) ∈ A(i) do
      if d(j) > c_ij then d(j) := c_ij and pred(j) := i;
  end;
end
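The pseudocode above can be sketched in Python using the heapq module as the priority queue. This is a sketch under assumptions (nodes labeled 1..n, adjacency lists of (node, cost) pairs); since heapq has no DECREASEKEY, stale entries are simply reinserted and skipped when popped ("lazy deletion").

```python
import heapq

# Sketch of Prim's algorithm with a binary heap. d(j) tracks the cheapest
# arc from the current tree to node j; pred(j) records its tree endpoint.

def prim(n, adj):
    """n nodes labeled 1..n; adj[i] = list of (j, cost) for arcs incident to i."""
    INF = float('inf')
    d = [INF] * (n + 1)
    pred = [None] * (n + 1)
    in_tree = [False] * (n + 1)
    d[1] = 0
    heap = [(0, 1)]
    F = []                                   # arcs of the growing tree
    while heap:
        cost, i = heapq.heappop(heap)        # node selection (FINDMIN)
        if in_tree[i]:
            continue                         # stale heap entry: skip
        in_tree[i] = True
        if pred[i] is not None:
            F.append((pred[i], i))           # add (pred(i), i) to F
        for j, c in adj[i]:                  # cost update
            if not in_tree[j] and c < d[j]:
                d[j] = c
                pred[j] = i
                heapq.heappush(heap, (c, j)) # DECREASEKEY via reinsertion
    return F
```

Lazy deletion keeps the heap size O(m) rather than O(n), but the running time remains O(m log n) since log m = O(log n) for simple graphs.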

Where have we seen this algorithm before?
algorithm DIJKSTRA;
begin
  P := {1}; T := N \ {1};
  d(1) := 0 and pred(1) := 0;
  d(j) := c_1j and pred(j) := 1 for all (1,j) ∈ A,
    and d(j) := ∞ otherwise;
  while P ≠ N do
  begin
    { node selection, also called FINDMIN }
    select a node i ∈ T with d(i) = min { d(j) : j ∈ T };
    P := P ∪ { i }; T := T \ { i };
    { distance update }
    for each (i,j) ∈ A(i) do
      if d(j) > d(i) + c_ij then
        d(j) := d(i) + c_ij and pred(j) := i;
  end;
end
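The two algorithms differ only in the update rule: Prim compares c_ij against d(j), while Dijkstra compares d(i) + c_ij. A Python sketch under the same assumptions as above (nodes 1..n, adjacency lists of (node, cost) pairs, heapq with lazy deletion) makes the one-line difference visible:

```python
import heapq

# Sketch of Dijkstra's algorithm; structurally identical to a heap-based
# Prim, except the label update uses d(i) + c_ij instead of c_ij.

def dijkstra(n, adj, source=1):
    INF = float('inf')
    d = [INF] * (n + 1)
    pred = [None] * (n + 1)
    done = [False] * (n + 1)
    d[source] = 0
    heap = [(0, source)]
    while heap:
        di, i = heapq.heappop(heap)          # node selection (FINDMIN)
        if done[i]:
            continue                         # stale heap entry: skip
        done[i] = True
        for j, c in adj[i]:                  # distance update
            if d[i] + c < d[j]:              # (Prim would test: c < d[j])
                d[j] = d[i] + c
                pred[j] = i
                heapq.heappush(heap, (d[j], j))
    return d, pred
```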
RUNNING TIME
Naïve Implementation: O(n^2), since each of the n-1 iterations spends O(n) on FINDMIN and O(|A(i)|) on cost updates.
Prim’s algorithm can be implemented using heaps.
Recall that the standard heap implementation has the following running times per operation:
FINDMIN: O(1)
DECREASEKEY: O(log n)
DELETEMIN: O(log n)
INSERT: O(log n)
Heap Implementation: O(m log n), since there are at most n-1 DELETEMIN and m DECREASEKEY operations.
Note: Prim’s algorithm can also be implemented using Fibonacci heaps in O(m+n log n) time.
EXAMPLE
The set F of arcs is "grown" starting from node 1:
[Figure: five snapshots of a network on nodes 1–5 with arc costs 10, 15, 20, 25, 30, 35, 40, showing the arcs of F added one at a time.]
Suppose that T* is a minimum cost spanning tree. For any arc (i,j) ∈ A, the cost interval of (i,j) is the range of values of c_ij for which T* continues to be a minimum cost spanning tree.

How can we find the cost interval of an arc (i,j) ∈ T*?

How can we find the cost interval of an arc (i,j) ∉ T*?
COMPREHENSIVE SENSITIVITY ANALYSIS 
How much time does it take to find the cost intervals for every arc (i,j) ∉ T*?
(The cost interval of a nontree arc (i,j) is [max cost of an arc in the tree path from i to j, +∞).)

In the homework, you will show that all of these values can be computed in O(n^2) time.

How much time does it take to find the cost intervals for every arc (i,j) ∈ T*?
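One way the O(n^2) computation can be sketched (hypothetical function name, not from the notes): root the tree at each node in turn and run a depth-first traversal, carrying along the maximum arc cost seen so far. One traversal per root gives the maximum-cost arc on the tree path between every pair of nodes in O(n^2) total time, since a tree has only n-1 arcs.

```python
# Sketch: for every pair of nodes, the maximum arc cost on the tree path
# between them, via one DFS per root node -- O(n^2) total for a tree.

def max_path_costs(n, tree_arcs):
    """tree_arcs = list of (i, j, cost) forming a spanning tree on 1..n."""
    adj = {v: [] for v in range(1, n + 1)}
    for i, j, c in tree_arcs:
        adj[i].append((j, c))
        adj[j].append((i, c))
    maxcost = [[0] * (n + 1) for _ in range(n + 1)]
    for root in range(1, n + 1):
        stack = [(root, 0)]        # (node, max arc cost on path from root)
        seen = {root}
        while stack:
            u, m = stack.pop()
            maxcost[root][u] = m
            for v, c in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append((v, max(m, c)))
    return maxcost
```

These values answer both questions above: a nontree arc (i,j) has cost interval [maxcost[i][j], +∞), and a tree arc's upper limit is the cheapest nontree arc whose tree path uses it.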
SUMMARY

The minimum cost spanning tree (MST) problem is an important problem in its own right.

The MST problem is also an important subproblem.

The greedy algorithm works.

Other very efficient algorithms exist.

Sensitivity analysis can be performed efficiently.
