
recoveryjournal.html
Web crawler open-source code
HTML
Page 1 of 3
/* RecoveryJournal
 *
 * $Id: RecoveryJournal.java 4969 2007-03-08 18:57:41Z gojomo $
 *
 * Created on Jul 20, 2004
 *
 * Copyright (C) 2004 Internet Archive.
 *
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 *
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 *
 * Heritrix is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 *
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 */
package org.archive.crawler.frontier;

import it.unimi.dsi.mg4j.util.MutableString;

import java.io.BufferedInputStream;
import java.io.EOFException;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.logging.Logger;

import org.apache.commons.httpclient.URIException;
import org.archive.crawler.datamodel.CandidateURI;
import org.archive.crawler.datamodel.CrawlURI;
import org.archive.crawler.framework.Frontier;
import org.archive.crawler.io.CrawlerJournal;
import org.archive.net.UURI;
import org.archive.net.UURIFactory;

import java.util.concurrent.CountDownLatch;

/**
 * Helper class for managing a simple Frontier change-events journal which is
 * useful for recovering from crawl problems.
 *
 * By replaying the journal into a new Frontier, its state (at least with
 * respect to URIs alreadyIncluded and in pending queues) will match that of the
 * original Frontier, allowing a pseudo-resume of a previous crawl, at least as
 * far as URI visitation/coverage is concerned.
 *
 * @author gojomo
 */
public class RecoveryJournal extends CrawlerJournal
implements FrontierJournal {
    private static final Logger LOGGER = Logger.getLogger(
            RecoveryJournal.class.getName());

    public final static String F_ADD = "F+ ";
    public final static String F_EMIT = "Fe ";
    public final static String F_RESCHEDULE = "Fr ";
    public final static String F_SUCCESS = "Fs ";
    public final static String F_FAILURE = "Ff ";

    // show recovery progress every this many lines
    private static final int PROGRESS_INTERVAL = 1000000;

    // once this many URIs are queued during recovery, allow
    // crawl to begin, while enqueuing of other URIs from log
    // continues in background
    private static final long ENOUGH_TO_START_CRAWLING = 100000;

    /**
     * Create a new recovery journal at the given location
     *
     * @param path Directory to make the recovery journal in.
     * @param filename Name to use for recovery journal file.
     * @throws IOException
     */
    public RecoveryJournal(String path, String filename)
    throws IOException {
        super(path, filename);
        timestamp_interval = 10000; // write timestamp lines occasionally
    }

    public synchronized void added(CrawlURI curi) {
        accumulatingBuffer.length(0);
        this.accumulatingBuffer.append(F_ADD).
            append(curi.toString()).
            append(" ").
            append(curi.getPathFromSeed()).
            append(" ").
            append(curi.flattenVia());
        writeLine(accumulatingBuffer);
    }

    public void finishedSuccess(CrawlURI curi) {
        finishedSuccess(curi.toString());
    }

    public void finishedSuccess(UURI uuri) {
        finishedSuccess(uuri.toString());
    }

    protected void finishedSuccess(String uuri) {
        writeLine(F_SUCCESS, uuri);
    }

    public void emitted(CrawlURI curi) {
        writeLine(F_EMIT, curi.toString());
    }

    public void finishedFailure(CrawlURI curi) {
        finishedFailure(curi.toString());
    }
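As the methods above show, each frontier event becomes one journal line starting with a three-character tag (F_ADD "F+ ", F_EMIT "Fe ", F_RESCHEDULE "Fr ", F_SUCCESS "Fs ", F_FAILURE "Ff "); an F+ line additionally carries the URI's pathFromSeed and flattened via, which is the information a replay needs to rebuild queue and already-included state. Below is a minimal, illustrative sketch of reading such a journal and tallying events by tag. It is not Heritrix's own replay logic (which appears later in this class, beyond page 1 of the listing); the class name JournalTally is hypothetical, and the sketch assumes an already-decompressed plain-text journal file (real journals may be compressed and interspersed with occasional timestamp lines, which are simply counted under their own prefix here).

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

// Sketch only: counts recovery-journal lines by their leading 3-character tag.
public class JournalTally {
    public static void main(String[] args) throws IOException {
        Path journal = Paths.get(args[0]); // path to a decompressed journal file
        Map<String, Integer> counts = new HashMap<>();
        try (BufferedReader reader = Files.newBufferedReader(journal)) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.length() < 3) {
                    continue; // skip blank or malformed lines
                }
                // Tags such as "F+ " and "Fs " are exactly 3 characters,
                // matching the constants defined in RecoveryJournal.
                String tag = line.substring(0, 3);
                counts.merge(tag, 1, Integer::sum);
            }
        }
        counts.forEach((tag, n) -> System.out.println(tag + ": " + n));
    }
}

Comparing the "Fs " (success) count against the "F+ " (added) count in this way gives a rough sense of how far a crashed crawl had progressed, which is the kind of bookkeeping the journal's line-per-event format makes cheap to do.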
