Floyd-Warshall Algorithm Description
1) Applicability:
a) APSP (All Pairs Shortest Paths)
b) Works best on dense graphs
c) Edge weights may be positive or negative (provided the graph contains no negative cycles)
2) Algorithm description:
a) Initialization: dis[u,v] := w[u,v]
b) For k := 1 to n
     For i := 1 to n
       For j := 1 to n
         If dis[i,j] > dis[i,k] + dis[k,j] Then
           dis[i,j] := dis[i,k] + dis[k,j]
c) Termination: dis is now the shortest-path matrix for all pairs of vertices
3) Summary: the algorithm is simple and effective. Because the triple loop is so compact, it is faster on dense graphs than running Dijkstra's algorithm |V| times. Time complexity: O(n^3). A runnable sketch is given below.
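As a concrete illustration of steps a) to c), here is a minimal runnable sketch in C; the vertex count N, the INF sentinel, and the example adjacency matrix are assumptions made for this sketch and are not part of the original description.

#include <stdio.h>

#define N 4                    /* number of vertices (example size, assumed) */
#define INF 1000000000LL       /* "no edge" sentinel; safe against overflow in long long */

/* Step a): dis[u][v] is initialized to w[u][v]; INF means no edge, 0 on the diagonal. */
long long dis[N][N] = {
    {0,   3,   INF, 7  },
    {8,   0,   2,   INF},
    {5,   INF, 0,   1  },
    {2,   INF, INF, 0  },
};

int main(void) {
    /* Step b): for each intermediate vertex k, try to improve every pair (i, j). */
    for (int k = 0; k < N; k++)
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                if (dis[i][k] + dis[k][j] < dis[i][j])
                    dis[i][j] = dis[i][k] + dis[k][j];

    /* Step c): dis now holds the shortest path length for every ordered pair. */
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++)
            printf("%lld ", dis[i][j]);
        printf("\n");
    }
    return 0;
}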
Consider the following variant: initialize dis[i,j] to 1 if (i,j) ∈ E and to 0 otherwise, and adapt the update so that dis[i,j] is set to 1 whenever both dis[i,k] and dis[k,j] are 1; the final matrix produced by Floyd's algorithm then answers whether there is a path from i to j. Even more simply, we can declare dis as boolean and replace the relaxation statement in step b) above with "dis[i,j] := dis[i,j] or (dis[i,k] and dis[k,j])", which gives the reachability between i and j directly, as sketched below.
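This boolean variant is essentially Warshall's transitive-closure algorithm. Below is a minimal C sketch of it; the vertex count N and the example adjacency matrix are assumed for illustration.

#include <stdbool.h>
#include <stdio.h>

#define N 4   /* number of vertices (example size, assumed) */

/* reach[i][j] starts as the adjacency matrix: true iff (i, j) is an edge (diagonal set to true). */
bool reach[N][N] = {
    {true,  true,  false, false},
    {false, true,  true,  false},
    {false, false, true,  false},
    {true,  false, false, true },
};

int main(void) {
    /* Same triple loop; the relaxation is replaced by the boolean update from the text. */
    for (int k = 0; k < N; k++)
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                reach[i][j] = reach[i][j] || (reach[i][k] && reach[k][j]);

    /* reach[i][j] is now true exactly when there is a path from i to j. */
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++)
            printf("%d ", (int)reach[i][j]);
        printf("\n");
    }
    return 0;
}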
標(biāo)簽:
Floyd-Warshall
Shortest
Pairs
Paths
上傳時(shí)間:
2013-12-01
上傳用戶:dyctj
The latest support vector machine toolbox; having it makes things much more convenient. Its to-do list:
1. Find time to write a proper list of things to do!
2. Documentation.
3. Support Vector Regression.
4. Automated model selection.
標(biāo)簽:
支持向量機(jī)
工具箱
上傳時(shí)間:
2013-12-16
上傳用戶:亞亞娟娟123