
recoverylogmapper.html
Category: web crawler open-source code (HTML)
Page 1 of 2
/* RecoveryLogMapper.java
 *
 * $Id: RecoveryLogMapper.java 4647 2006-09-22 18:39:39Z paul_jack $
 *
 * Created on Mar 7, 2005
 *
 * Copyright (C) 2005 Mike Schwartz.
 *
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 *
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 *
 * Heritrix is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 *
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 */

/**
 * Parses a Heritrix recovery log file (recover.gz), and builds maps
 * that allow a caller to look up any seed URL and get back an Iterator of all
 * URLs successfully crawled from the given seed.
 *
 * Also allows lookup on any crawled URL to find the seed URL from which the
 * crawler reached that URL (through 1 or more discovered URL hops, which are
 * collapsed in this lookup).
 *
 * <p>This code creates some fairly large collections (proportionate in size
 * to the number of discovered URLs), so make sure you allocate it a large
 * heap to work in. It also takes a while to process a recover log.
 * <p>See the {@link #main()} method at the end for test/demo code.
 * @author Mike Schwartz, schwartz at CodeOnTheRoad dot com
 */
package org.archive.crawler.util;

import org.archive.crawler.frontier.RecoveryJournal;

import java.io.File;
import java.io.LineNumberReader;
import java.io.PrintWriter;
import java.io.FileOutputStream;
import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;
import java.util.logging.Level;
import java.util.logging.Logger;

public class RecoveryLogMapper {
    private static final char LOG_LINE_START_CHAR =
        RecoveryJournal.F_ADD.charAt(0);
    private static final Logger logger =
        Logger.getLogger(RecoveryLogMapper.class.getName());
    private PrintWriter seedNotFoundPrintWriter = null;

    /**
     * Tracks seed for each crawled URL
     */
    private Map<String,String> crawledUrlToSeedMap
     = new HashMap<String,String>();

    /**
     * Maps seed URLs to Set of discovered URLs
     */
    private Map<String,Set<String>> seedUrlToDiscoveredUrlsMap
     = new HashMap<String,Set<String>>();

    /**
     * Tracks which URLs were successfully crawled
     */
    private Set<String> successfullyCrawledUrls = new HashSet<String>();

    /**
     * Normal constructor - if a not-found seed is encountered while loading
     * recoverLogFileName, throws SeedUrlNotFoundException.
     * Use {@link #RecoveryLogMapper(String, String)} if you want to just log
     * such cases and keep going.  (Those should not happen if the
     * recover log is written correctly, but we see them in practice.)
     * @param recoverLogFileName
     * @throws java.io.FileNotFoundException
     * @throws java.io.IOException
     * @throws SeedUrlNotFoundException
     */
    public RecoveryLogMapper(String recoverLogFileName)
    throws java.io.FileNotFoundException, java.io.IOException,
            SeedUrlNotFoundException {
        load(recoverLogFileName);
    }

    /**
     * Constructor to use if you want to allow not-found seeds, logging
     * them to seedNotFoundLogFileName.  In contrast, {@link
     * #RecoveryLogMapper(String)} will throw SeedUrlNotFoundException
     * when a seed isn't found.
     * @param recoverLogFileName
     * @param seedNotFoundLogFileName
     */
    public RecoveryLogMapper(String recoverLogFileName,
                             String seedNotFoundLogFileName)
        throws java.io.FileNotFoundException, java.io.IOException,
               SeedUrlNotFoundException {
        seedNotFoundPrintWriter = new PrintWriter(new FileOutputStream(
               seedNotFoundLogFileName));
        load(recoverLogFileName);
    }

    protected void load(String recoverLogFileName)
    throws java.io.FileNotFoundException, java.io.IOException,
            SeedUrlNotFoundException {
        LineNumberReader reader = new LineNumberReader(RecoveryJournal.
            getBufferedReader(new File(recoverLogFileName)));
        String curLine = null;
        while ((curLine = reader.readLine()) != null) {
            if (curLine.length() == 0
                    || curLine.charAt(0) != LOG_LINE_START_CHAR) {
                continue;
            }
            String args[] = curLine.split("\\s+");
            int curLineNumWords = args.length;
            String firstUrl = args[1];
            // Ignore DNS log entries
            if (firstUrl.startsWith("dns:")) {
                continue;
            }
            if (curLine.startsWith(RecoveryJournal.F_ADD)) {
                // Seed URL
                if (curLineNumWords == 2) {
                    if (logger.isLoggable(Level.FINE)) {
                        logger.fine("F_ADD with 2 words --> seed URL (" +
                            firstUrl + ")");
                    }
                    // Add seed the first time we find it
                    if (seedUrlToDiscoveredUrlsMap.get(firstUrl) == null) {
                        seedUrlToDiscoveredUrlsMap.put(firstUrl,
                            new HashSet<String>());
                    }
                } else {
                    // URL found via an earlier seeded / discovered URL.
                    // Look for the seed from which firstUrl came, so we can
                    // collapse the new URL back to it.
                    String viaUrl = args[curLineNumWords - 1];
                    if (logger.isLoggable(Level.FINE)) {
                        logger.fine("F_ADD with 3+ words --> new URL "
                                + firstUrl + " via URL " + viaUrl);
                    }
                    String seedForFirstUrl =
                        (String) crawledUrlToSeedMap.get(viaUrl);
                    // viaUrl is itself a seed URL
                    if (seedForFirstUrl == null) {
                        if (logger.isLoggable(Level.FINE)) {
                            logger.fine("\tvia URL is a seed");
                        }

[Listing continues on page 2.]
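The visible part of load() turns on the word count of each F_ADD line: two words mean the URL is a seed, three or more mean the last word is the via-URL through which it was discovered, and dns: entries are skipped. The sketch below replays that classification on made-up sample lines; the "F+ " prefix and the hops-path token in the second sample are assumptions standing in for RecoveryJournal.F_ADD and the real recover.gz layout, not a specification of the format.

// Sketch: how load() classifies F_ADD lines. Sample lines and the
// "F+ " prefix are assumptions, not taken from a real recover.gz.
public class FAddLineSketch {
    private static final String F_ADD = "F+ "; // stands in for RecoveryJournal.F_ADD

    public static void main(String[] ignored) {
        String[] samples = {
            "F+ http://seed.example.com/",                        // 2 words: a seed
            "F+ http://example.com/a L http://seed.example.com/", // 3+ words: discovered
            "F+ dns:seed.example.com",                            // DNS entry: skipped
        };
        for (String line : samples) {
            if (line.length() == 0 || line.charAt(0) != F_ADD.charAt(0)) {
                continue;                      // not an F_ADD line
            }
            String[] words = line.split("\\s+");
            String url = words[1];
            if (url.startsWith("dns:")) {
                continue;                      // load() ignores DNS log entries
            }
            if (words.length == 2) {
                System.out.println(url + " --> seed URL");
            } else {
                // Last word is the via-URL; load() collapses it back to its seed.
                System.out.println(url + " --> discovered via "
                        + words[words.length - 1]);
            }
        }
    }
}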

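On this page the caller's only decision is which constructor to use: the one-argument form fails fast with SeedUrlNotFoundException, while the two-argument form logs not-found seeds to a separate file and keeps going. Below is a minimal usage sketch with hypothetical file names (the lookup accessors for the maps are on page 2 of the listing); per the class comment, give the JVM a generous heap, e.g. java -Xmx2g, for a large crawl.

import org.archive.crawler.util.RecoveryLogMapper;
import org.archive.crawler.util.SeedUrlNotFoundException;

// Minimal usage sketch; "recover.gz" and "seed-not-found.log" are
// hypothetical paths, not names mandated by the class.
public class RecoveryLogMapperUsage {
    public static void main(String[] args) throws Exception {
        try {
            // Strict: throws SeedUrlNotFoundException if a via-URL's seed
            // cannot be resolved while loading the log.
            RecoveryLogMapper strict = new RecoveryLogMapper("recover.gz");
        } catch (SeedUrlNotFoundException e) {
            System.err.println("not-found seed: " + e.getMessage());
        }

        // Lenient: logs not-found seeds to the second file and keeps going.
        RecoveryLogMapper lenient =
            new RecoveryLogMapper("recover.gz", "seed-not-found.log");
    }
}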