
bdbmultipleworkqueues.html
Category: web crawler open-source code (HTML)
Page 1 of 4
/* BdbMultipleWorkQueues
 * 
 * Created on Dec 24, 2004
 *
 * Copyright (C) 2004 Internet Archive.
 * 
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 * 
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 * 
 * Heritrix is distributed in the hope that it will be useful, 
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 * 
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 */
package org.archive.crawler.frontier;

import java.io.UnsupportedEncodingException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.regex.Pattern;

import org.archive.crawler.datamodel.CrawlURI;
import org.archive.crawler.framework.FrontierMarker;
import org.archive.util.ArchiveUtils;

import com.sleepycat.bind.serial.StoredClassCatalog;
import com.sleepycat.je.Cursor;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.DatabaseEntry;
import com.sleepycat.je.DatabaseException;
import com.sleepycat.je.DatabaseNotFoundException;
import com.sleepycat.je.Environment;
import com.sleepycat.je.OperationStatus;
import com.sleepycat.util.RuntimeExceptionWrapper;


/**
 * A BerkeleyDB-database-backed structure for holding ordered
 * groupings of CrawlURIs. Reading the groupings from specific
 * per-grouping (per-classKey/per-Host) starting points allows
 * this to act as a collection of independent queues.
 * 
 * <p>For how the bdb keys are made, see {@link #calculateInsertKey(CrawlURI)}.
 * 
 * <p>TODO: refactor, improve naming.
 * 
 * @author gojomo
 */
public class BdbMultipleWorkQueues {
    private static final long serialVersionUID = ArchiveUtils
        .classnameBasedUID(BdbMultipleWorkQueues.class, 1);

    private static final Logger LOGGER =
        Logger.getLogger(BdbMultipleWorkQueues.class.getName());

    /** Database holding all pending URIs, grouped in virtual queues */
    private Database pendingUrisDB = null;

    /** Supporting bdb serialization of CrawlURIs */
    private RecyclingSerialBinding crawlUriBinding;

    /**
     * Create the multi queue in the given environment.
     * 
     * @param env bdb environment to use
     * @param classCatalog Class catalog to use.
     * @param recycle True if we are to reuse db content if any.
     * @throws DatabaseException
     */
    public BdbMultipleWorkQueues(Environment env,
            StoredClassCatalog classCatalog, final boolean recycle)
    throws DatabaseException {
        // Open the database. Create it if it does not already exist.
        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setAllowCreate(true);
        if (!recycle) {
            try {
                env.truncateDatabase(null, "pending", false);
            } catch (DatabaseNotFoundException e) {
                // Ignored
            }
        }
        // Make database deferred write: URLs that are added then removed
        // before a page-out is required need never cause disk IO.
        dbConfig.setDeferredWrite(true);

        this.pendingUrisDB = env.openDatabase(null, "pending", dbConfig);
        crawlUriBinding =
            new RecyclingSerialBinding(classCatalog, CrawlURI.class);
    }

    /**
     * Delete all CrawlURIs matching the given expression.
     * 
     * @param match
     * @param queue
     * @param headKey
     * @return count of deleted items
     * @throws DatabaseException
     */
    public long deleteMatchingFromQueue(String match, String queue,
            DatabaseEntry headKey) throws DatabaseException {
        long deletedCount = 0;
        Pattern pattern = Pattern.compile(match);
        DatabaseEntry key = headKey;
        DatabaseEntry value = new DatabaseEntry();
        Cursor cursor = null;
        try {
            cursor = pendingUrisDB.openCursor(null, null);
            OperationStatus result = cursor.getSearchKeyRange(headKey,
                    value, null);

            while (result == OperationStatus.SUCCESS) {
                if (value.getData().length > 0) {
                    CrawlURI curi = (CrawlURI) crawlUriBinding
                            .entryToObject(value);
                    if (!curi.getClassKey().equals(queue)) {
                        // rolled into next queue; finished with this queue
                        break;
                    }
                    if (pattern.matcher(curi.toString()).matches()) {
                        cursor.delete();
                        deletedCount++;

[Listing truncated here; continues on page 2 of 4.]
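The core trick in the listing above is multiplexing many logical queues into one ordered BDB key space: each entry's key starts with its queue's classKey, so positioning a cursor at a per-queue head key (`Cursor.getSearchKeyRange`) and scanning until the classKey changes yields exactly one queue's items. As a minimal, stdlib-only analog of that pattern (the class and method names below are hypothetical, not Heritrix or BDB-JE APIs), a sorted map's `tailMap` scan plays the role of the cursor range search:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.NavigableMap;
import java.util.TreeMap;

/**
 * Sketch of the "many queues in one ordered key space" idea.
 * Keys are "<classKey>:<zero-padded ordinal>", so all entries of one
 * logical queue are contiguous in sorted order, like BDB's btree keys.
 */
public class MultiQueueSketch {
    private final NavigableMap<String, String> store = new TreeMap<>();

    /** Enqueue a URI under its queue's key prefix. */
    public void put(String classKey, int ordinal, String uri) {
        // Zero-padding keeps lexicographic order equal to numeric order.
        store.put(classKey + ":" + String.format("%08d", ordinal), uri);
    }

    /** Read one queue's URIs in order, starting from its head key. */
    public List<String> readQueue(String classKey) {
        List<String> out = new ArrayList<>();
        String head = classKey + ":";
        // tailMap(head) is the analog of cursor.getSearchKeyRange(headKey, ...)
        for (Map.Entry<String, String> e : store.tailMap(head).entrySet()) {
            if (!e.getKey().startsWith(head)) {
                break; // rolled into the next queue; finished with this one
            }
            out.add(e.getValue());
        }
        return out;
    }
}
```

Interleaved inserts for different classKeys still come back grouped and ordered per queue, which is why a single database can serve as a collection of independent queues.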
