robotsexclusionpolicy.html — web crawler open-source code (HTML xref listing, page 1 of 2)
/* Copyright (C) 2003 Internet Archive.
 *
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 *
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 *
 * Heritrix is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 *
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 *
 * RobotsExclusionPolicy.java
 * Created on Apr 17, 2003
 *
 * $Header$
 */
package org.archive.crawler.datamodel;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.apache.commons.httpclient.URIException;
import org.archive.crawler.settings.CrawlerSettings;

/**
 * RobotsExclusionPolicy represents the actual policy adopted with
 * respect to a specific remote server, usually constructed from
 * consulting the robots.txt, if any, the server provided.
 *
 * (The similarly named RobotsHonoringPolicy, on the other hand,
 * describes the strategy used by the crawler to determine to what
 * extent it respects exclusion rules.)
 *
 * The expiration of policies after a suitable amount of time has
 * elapsed since last fetch is handled outside this class, in
 * CrawlServer itself.
 *
 * @author gojomo
 *
 */
public class RobotsExclusionPolicy implements Serializable {

    private static final long serialVersionUID = 6323907991237383113L;

    private static final Logger logger =
        Logger.getLogger(RobotsExclusionPolicy.class.getName());

    private final static int NORMAL_TYPE = 0;
    private final static int ALLOWALL_TYPE = 1;
    private final static int DENYALL_TYPE = 2;
    private transient int type = NORMAL_TYPE;

    public static RobotsExclusionPolicy ALLOWALL =
        new RobotsExclusionPolicy(ALLOWALL_TYPE);
    public static RobotsExclusionPolicy DENYALL =
        new RobotsExclusionPolicy(DENYALL_TYPE);

    private LinkedList<String> userAgents = null;
    private HashMap<String,List<String>> disallows = null;
    transient RobotsHonoringPolicy honoringPolicy = null;

    private String lastUsedUserAgent = null;
    private List<String> userAgentsToTest = null;

    /**
     * @param settings
     * @param reader
     * @param honoringPolicy
     * @return Robot exclusion policy.
     * @throws IOException
     */
    public static RobotsExclusionPolicy policyFor(CrawlerSettings settings,
            BufferedReader reader, RobotsHonoringPolicy honoringPolicy)
    throws IOException {
        LinkedList<String> userAgents = new LinkedList<String>();
        HashMap<String,List<String>> disallows
         = new HashMap<String,List<String>>();
        Robotstxt.parse(reader, userAgents, disallows);
        return (disallows.isEmpty())?
            ALLOWALL:
            new RobotsExclusionPolicy(settings, userAgents, disallows,
                honoringPolicy);
    }


    /**
     * @param settings
     * @param u
     * @param d
     * @param honoringPolicy
     */
    public RobotsExclusionPolicy(CrawlerSettings settings, LinkedList<String> u,
            HashMap<String,List<String>> d,
            RobotsHonoringPolicy honoringPolicy) {
        userAgents = u;
        disallows = d;
        this.honoringPolicy = honoringPolicy;

        if(honoringPolicy == null) return;

        // If honoring policy is most favored user agent, all rules should be checked
        if(honoringPolicy.isType(settings, RobotsHonoringPolicy.MOST_FAVORED)) {
            userAgentsToTest = userAgents;

        // If honoring policy is most favored of set, then make a list with only the set as members
        } else if(honoringPolicy.isType(settings, RobotsHonoringPolicy.MOST_FAVORED_SET)) {
            userAgentsToTest = new ArrayList<String>();
            Iterator userAgentSet = honoringPolicy.getUserAgents(settings).iterator();
            while(userAgentSet.hasNext()) {
                String userAgent = (String) userAgentSet.next();

                Iterator iter = userAgents.iterator();
