Listing 9-4.  Johnson's Algorithm

from copy import deepcopy

def johnson(G):                                 # All pairs shortest paths
    G = deepcopy(G)                             # Don't want to break original
    s = object()                                # Guaranteed unused node
    G[s] = {v:0 for v in G}                     # Edges from s have zero wgt
    h, _ = bellman_ford(G, s)                   # h[v]: Shortest dist from s
    del G[s]                                    # No more need for s
    for u in G:                                 # The weight from u ...
        for v in G[u]:                          # ... to v ...
            G[u][v] += h[u] - h[v]              # ... is adjusted (nonneg.)
    D, P = {}, {}                               # D[u][v] and P[u][v]
    for u in G:                                 # From every u ...
        D[u], P[u] = dijkstra(G, u)             # ... find the shortest paths
        for v in G:                             # For each destination ...
            D[u][v] += h[v] - h[u]              # ... readjust the distance
    return D, P                                 # These are two-dimensional

6 As you can see, I just instantiate object to create the node s. Each such instance is unique (that is, two instances aren't equal under ==), which makes them useful as added dummy nodes, as well as for other forms of sentinel objects, which need to be different from all legal values.
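The uniqueness of such sentinel instances is easy to confirm with a quick check:

```python
# Two fresh object() instances are distinct and never compare equal
s, t = object(), object()
assert s != t            # unique identity: no two instances are ==
assert s == s            # each instance is still equal to itself

# They're hashable, so they work fine as graph nodes (dict keys)
G = {s: {t: 0}, t: {}}
assert G[s][t] == 0
```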
■ Note  There is no need to check whether the call to bellman_ford succeeded or whether it found a negative cycle (in which case Johnson's algorithm won't work), because if there is a negative cycle in the graph, bellman_ford would raise an exception.
Assuming the Θ(m lg n) running time for Dijkstra's algorithm, Johnson's is simply a factor of n slower, giving us Θ(mn lg n), which is faster than the cubic running time of Floyd-Warshall (discussed in a bit) for sparse graphs (that is, for graphs with relatively few edges).7
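To see the whole pipeline run end to end, here is a self-contained sketch pairing Listing 9-4's johnson with simplified bellman_ford and dijkstra helpers. These helpers are minimal stand-ins for the chapter's earlier listings, not the book's exact versions:

```python
from copy import deepcopy
from heapq import heappush, heappop

inf = float('inf')

def relax(G, u, v, D, P):                       # Try to improve D[v] via u
    d = D.get(u, inf) + G[u][v]
    if d < D.get(v, inf):
        D[v], P[v] = d, u
        return True

def bellman_ford(G, s):                         # Handles negative weights
    D, P = {s: 0}, {}
    for rnd in G:                               # At most n = len(G) rounds
        changed = False
        for u in G:
            for v in G[u]:
                if relax(G, u, v, D, P):
                    changed = True
        if not changed:                         # Converged early: done
            break
    else:                                       # Still improving after n rounds:
        raise ValueError('negative cycle')      # ... must be a negative cycle
    return D, P

def dijkstra(G, s):                             # Nonnegative weights only
    D, P, Q, S = {s: 0}, {}, [(0, s)], set()
    while Q:
        _, u = heappop(Q)
        if u in S:
            continue
        S.add(u)
        for v in G[u]:
            relax(G, u, v, D, P)
            heappush(Q, (D[v], v))
    return D, P

def johnson(G):                                 # Listing 9-4, as in the text
    G = deepcopy(G)
    s = object()
    G[s] = {v: 0 for v in G}
    h, _ = bellman_ford(G, s)
    del G[s]
    for u in G:
        for v in G[u]:
            G[u][v] += h[u] - h[v]
    D, P = {}, {}
    for u in G:
        D[u], P[u] = dijkstra(G, u)
        for v in G:
            D[u][v] += h[v] - h[u]
    return D, P

# A small graph with a negative edge but no negative cycle:
G = {'a': {'b': -2, 'c': 4}, 'b': {'c': 3}, 'c': {'d': 2}, 'd': {'a': 1}}
D, P = johnson(G)
print(D['a']['c'])                              # a -> b -> c: -2 + 3 = 1
```

Note how the final readjustment restores the true distances: D['a']['b'] comes out as −2 even though Dijkstra only ever saw the reweighted, nonnegative edges.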
The transform used in Johnson's algorithm is closely related to the potential function of the A* algorithm (see "Knowing Where You're Going," later in this chapter), and it is similar to the transform used in the min-cost bipartite matching problem in Chapter 10. There, too, the goal is to ensure positive edge weights, but in a slightly different situation (edge weights changing from iteration to iteration).
Far-Fetched Subproblems
While Dijkstra’s algorithm is certainly based on the principles of dynamic programming, the fact is partly obscured 
by the need to discover the ordering of (or dependencies between) subproblems on the go. The algorithm I discuss 
in this section, discovered independently by Roy, Floyd, and Warshall, is a prototypical example of DP. It is based on 
a memoized recursive decomposition and is iterative in its common implementation. It is deceptively simple in form 
but devilishly clever in design. It is, in some ways, based on the “in or out” principle discussed in Chapter 8, but the 
resulting subproblems may, at least at first glance, seem highly artificial and far-fetched.
In many DP problems, we might need to hunt a bit for a set of recursively related subproblems, but once we 
find them, they often seem quite natural. Just think of the nodes in DAG shortest path, for example, or the prefix 
pairs of the longest common subsequence problem. The latter illustrates a useful principle that can be extended to 
less obvious structures, though: restricting which elements we’re allowed to work with. In the LCS problem, we’re 
restricting the lengths of prefixes, for example. In the knapsack problem, this is slightly more artificial: We invent 
an ordering for the objects and restrict ourselves to the k first ones. The subproblem is then parametrized by this 
“permitted set” and a portion of the knapsack capacity.
In the all-pairs shortest path problem, we can use this form of restriction, along with the “in or out” principle, to design 
a set of nonobvious subproblems: We arbitrarily order the nodes and restrict how many—that is, the k first—we’re allowed 
to use as intermediate nodes in forming our paths. We have now parametrized our subproblems using three parameters:
•	The starting node
•	The ending node
•	The highest node number we're allowed to pass through
7 A common criterion for calling a graph sparse is that m is O(n), for example. In this case, though, Johnson's will (asymptotically) match Floyd-Warshall as long as m is O(n²/lg n), which allows for quite a lot of edges. On the other hand, Floyd-Warshall has very low constant overhead.


Unless you had some idea where we were going with this, adding the third item might seem totally 
unproductive—how could it help us to restrict what we’re allowed to do? As I’m sure you can see, the idea is to 
partition the solution space, decomposing the problem into subproblems and then linking these into a subproblem 
graph. The linking is achieved by creating a recursive dependency based on the “in or out” idea: node k, in or out?
Let d(u, v, k) be the length of the shortest path that exists from node u to node v if you're only allowed to use the k first nodes as intermediate nodes. We can decompose the problem as follows:

d(u, v, k) = min(d(u, v, k−1), d(u, k, k−1) + d(k, v, k−1))

Like in the knapsack problem, we're considering whether to include k. If we don't include it, we simply use the existing solution, the shortest path we could find without using k, which is d(u, v, k−1). If we do include it, we must use the shortest path to k (which is d(u, k, k−1)) as well as the shortest path from k (which is d(k, v, k−1)). Note that in all three of these subproblems, we're working with the k−1 first nodes, because either we're excluding k or we're explicitly using it as an endpoint and not an intermediate node. This guarantees us a size-ordering (that is, a topological sorting) of the subproblems: no cycles.
You can see the resulting algorithm in Listing 9-5. (The implementation uses the memo decorator from Chapter 8.) Note that I'm assuming the nodes are integers in the range 1...n here. If you're using other node objects, you could have a list V containing the nodes in some arbitrary order and then use V[k-1] and V[k-2] instead of k and k-1 in the min part. Also note that the returned D map has the form D[u,v] rather than D[u][v]. I'm also assuming that the input G is a full weight matrix, so G[u][v] is inf if there is no edge from u to v. You could easily modify all of this, if you want.
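The recurrence above can also be sketched directly as a memoized recursion. This standalone version is not Listing 9-5; it uses functools.lru_cache in place of the memo decorator from Chapter 8, and a node list V so that arbitrary node objects work:

```python
from functools import lru_cache

inf = float('inf')

def rec_floyd_warshall(G):
    # G is a full weight matrix: G[u][v] is inf if there is no edge
    V = sorted(G)                               # Arbitrary but fixed ordering
    n = len(V)

    @lru_cache(maxsize=None)                    # Memoize the subproblems
    def d(u, v, k):                             # First k nodes as intermediates
        if k == 0:                              # No intermediates allowed:
            return 0 if u == v else G[u][v]     # ... just the direct edge
        w = V[k-1]                              # Node number k in the ordering
        return min(d(u, v, k-1),                # Either skip node w ...
                   d(u, w, k-1) + d(w, v, k-1)) # ... or pass through it
    return {(u, v): d(u, v, n) for u in V for v in V}

G = {'a': {'a': 0, 'b': 3, 'c': 8},
     'b': {'a': inf, 'b': 0, 'c': 2},
     'c': {'a': 4, 'b': inf, 'c': 0}}
D = rec_floyd_warshall(G)
print(D['a', 'c'])                              # a -> b -> c: 3 + 2 = 5
```

As in the text, the result has the form D[u, v] rather than D[u][v], and each subproblem d(u, v, k) is computed at most once thanks to the memoization.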
