crawluri.html
Web crawler open-source code: CrawlURI.java cross-reference (HTML)
Page 1 of 5
/* Copyright (C) 2003 Internet Archive.
 *
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 *
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 *
 * Heritrix is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 *
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 *
 * CrawlURI.java
 * Created on Apr 16, 2003
 *
 * $Header$
 */
package org.archive.crawler.datamodel;

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArrayList;

import org.apache.commons.httpclient.HttpStatus;
import org.apache.commons.httpclient.URIException;
import org.archive.crawler.datamodel.credential.CredentialAvatar;
import org.archive.crawler.datamodel.credential.Rfc2617Credential;
import org.archive.crawler.extractor.Link;
import org.archive.crawler.framework.Processor;
import org.archive.crawler.framework.ProcessorChain;
import org.archive.crawler.util.Transform;
import org.archive.net.UURI;
import org.archive.net.UURIFactory;
import org.archive.util.Base32;
import org.archive.util.HttpRecorder;

import st.ata.util.AList;
import st.ata.util.HashtableAList;


/**
 * Represents a candidate URI and the associated state it
 * collects as it is crawled.
 *
 * <p>Core state is in instance variables but a flexible
 * attribute list is also available. Use this 'bucket' to carry
 * custom processing extracted data and state across CrawlURI
 * processing.  See the {@link #putString(String, String)},
 * {@link #getString(String)}, etc.
 *
 * @author Gordon Mohr
 */
public class CrawlURI extends CandidateURI
implements FetchStatusCodes {

    private static final long serialVersionUID = 7874096757350100472L;

    public static final int UNCALCULATED = -1;

    // INHERITED FROM CANDIDATEURI
    // uuri: core identity: the "usable URI" to be crawled
    // isSeed
    // inScopeVersion
    // pathFromSeed
    // via

    // Processing progress
    transient private Processor nextProcessor;
    transient private ProcessorChain nextProcessorChain;
    private int fetchStatus = 0;    // default to unattempted
    private int deferrals = 0;     // count of postponements for prerequisites
    private int fetchAttempts = 0; // the number of fetch attempts that have been made
    transient private int threadNumber;

    // dynamic context
    /** @deprecated */
    private int linkHopCount = UNCALCULATED; // from seeds
    /** @deprecated */
    private int embedHopCount = UNCALCULATED; // from a sure link; reset upon any link traversal

    // User agent to masquerade as when crawling this URI. If null, globals should be used
    private String userAgent = null;

    // Once a link extractor has finished processing this curi this will be
    // set as true
    transient private boolean linkExtractorFinished = false;

    /**
     * Protection against outlink overflow.
     * Change value by setting alternate maximum in heritrix.properties.
     */
    public static final int MAX_OUTLINKS = Integer.
        parseInt(System.getProperty(CrawlURI.class.getName() + ".maxOutLinks",
            "6000"));

    transient private int discardedOutlinks = 0;

////////////////////////////////////////////////////////////////////
    private long contentSize = UNCALCULATED;
    private long contentLength = UNCALCULATED;

    /**
     * Current http recorder.
     *
     * Gets set upon successful request.  Reset at start of processing chain.
     */
    private transient HttpRecorder httpRecorder = null;

    /**
     * Content type of a successfully fetched URI.
     *
     * May be null even on successfully fetched URI.
     */
    private String contentType = null;

    /**
     * True if this CrawlURI has been deemed a prerequisite by the
     * {@link org.archive.crawler.prefetch.PreconditionEnforcer}.
     *
     * This flag is used at least inside in the precondition enforcer so that
     * subsequent prerequisite tests know to let this CrawlURI through because
     * its a prerequisite needed by an earlier prerequisite tests (e.g. If
     * this is a robots.txt, then the subsequent login credentials prereq
     * test must not throw it out because its not a login curi).
     */
    private boolean prerequisite = false;

    /**
     * Set to true if this <code>curi</code> is to be POST'd rather than GET-d.
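
The class Javadoc above describes a flexible attribute "bucket" (the AList imported from st.ata.util) for carrying custom extracted data and state between processing steps. A minimal sketch of that usage follows; it assumes a CrawlURI(UURI) constructor and invents the key name, neither of which appears in the excerpt above, while putString and getString are the accessors the Javadoc itself references.

import org.apache.commons.httpclient.URIException;
import org.archive.crawler.datamodel.CrawlURI;
import org.archive.net.UURIFactory;

public class CrawlUriBucketSketch {
    public static void main(String[] args) throws URIException {
        // Build the "usable URI" with UURIFactory (imported in the listing above);
        // the CrawlURI(UURI) constructor is assumed here, not shown in the excerpt.
        CrawlURI curi = new CrawlURI(UURIFactory.getInstance("http://example.com/"));

        // Stash custom state in the attribute bucket during one processor...
        curi.putString("x-example-note", "value extracted by a custom processor");

        // ...and read it back in a later processor in the chain.
        System.out.println(curi.getString("x-example-note"));
    }
}

Note also that the MAX_OUTLINKS cap in the listing is read from a JVM system property named after the class, so its default of 6000 can be raised without code changes, for example by starting the JVM with -Dorg.archive.crawler.datamodel.CrawlURI.maxOutLinks=10000; because the field is a static final, the property must be set before the class is loaded.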
