bdbmultipleworkqueues.html
Web crawler open-source code (HTML xref)
Page 1 of 4
BdbMultipleWorkQueues xref

/* BdbMultipleWorkQueues
 *
 * Created on Dec 24, 2004
 *
 * Copyright (C) 2004 Internet Archive.
 *
 * This file is part of the Heritrix web crawler (crawler.archive.org).
 *
 * Heritrix is free software; you can redistribute it and/or modify
 * it under the terms of the GNU Lesser Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * any later version.
 *
 * Heritrix is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU Lesser Public License for more details.
 *
 * You should have received a copy of the GNU Lesser Public License
 * along with Heritrix; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
 */
package org.archive.crawler.frontier;

import java.io.UnsupportedEncodingException;
import java.math.BigInteger;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.regex.Pattern;

import org.archive.crawler.datamodel.CrawlURI;
import org.archive.crawler.framework.FrontierMarker;
import org.archive.util.ArchiveUtils;

import com.sleepycat.bind.serial.StoredClassCatalog;
import com.sleepycat.je.Cursor;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.DatabaseEntry;
import com.sleepycat.je.DatabaseException;
import com.sleepycat.je.DatabaseNotFoundException;
import com.sleepycat.je.Environment;
import com.sleepycat.je.OperationStatus;
import com.sleepycat.util.RuntimeExceptionWrapper;


/**
 * A BerkeleyDB-database-backed structure for holding ordered
 * groupings of CrawlURIs. Reading the groupings from specific
 * per-grouping (per-classKey/per-Host) starting points allows
 * this to act as a collection of independent queues.
 *
 * <p>For how the bdb keys are made, see {@link #calculateInsertKey(CrawlURI)}.
 *
 * <p>TODO: refactor, improve naming.
 *
 * @author gojomo
 */
public class BdbMultipleWorkQueues {
    private static final long serialVersionUID = ArchiveUtils
        .classnameBasedUID(BdbMultipleWorkQueues.class, 1);

    private static final Logger LOGGER =
        Logger.getLogger(BdbMultipleWorkQueues.class.getName());

    /** Database holding all pending URIs, grouped in virtual queues */
    private Database pendingUrisDB = null;

    /** Supporting bdb serialization of CrawlURIs */
    private RecyclingSerialBinding crawlUriBinding;

    /**
     * Create the multi queue in the given environment.
     *
     * @param env bdb environment to use
     * @param classCatalog Class catalog to use.
     * @param recycle True if we are to reuse db content if any.
     * @throws DatabaseException
     */
    public BdbMultipleWorkQueues(Environment env,
        StoredClassCatalog classCatalog, final boolean recycle)
    throws DatabaseException {
        // Open the database. Create it if it does not already exist.
        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setAllowCreate(true);
        if (!recycle) {
            try {
                env.truncateDatabase(null, "pending", false);
            } catch (DatabaseNotFoundException e) {
                // Ignored
            }
        }
        // Make database deferred write: URLs that are added then removed
        // before a page-out is required need never cause disk IO.
        dbConfig.setDeferredWrite(true);

        this.pendingUrisDB = env.openDatabase(null, "pending", dbConfig);
        crawlUriBinding =
            new RecyclingSerialBinding(classCatalog, CrawlURI.class);
    }

    /**
     * Delete all CrawlURIs matching the given expression.
     *
     * @param match
     * @param queue
     * @param headKey
     * @return count of deleted items
     * @throws DatabaseException
     */
    public long deleteMatchingFromQueue(String match, String queue,
            DatabaseEntry headKey) throws DatabaseException {
        long deletedCount = 0;
        Pattern pattern = Pattern.compile(match);
        DatabaseEntry key = headKey;
        DatabaseEntry value = new DatabaseEntry();
        Cursor cursor = null;
        try {
            cursor = pendingUrisDB.openCursor(null, null);
            OperationStatus result = cursor.getSearchKeyRange(headKey,
                    value, null);

            while (result == OperationStatus.SUCCESS) {
                if (value.getData().length > 0) {
                    CrawlURI curi = (CrawlURI) crawlUriBinding
                            .entryToObject(value);
                    if (!curi.getClassKey().equals(queue)) {
                        // rolled into next queue; finished with this queue
                        break;
                    }
                    if (pattern.matcher(curi.toString()).matches()) {
                        cursor.delete();
                        deletedCount++;

[Listing truncated here; the remainder of the file is on pages 2-4.]
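The class Javadoc and constructor above describe the central idea: every pending CrawlURI lives in one physical Berkeley DB JE database, and because each record's key begins with its queue's classKey, positioning a cursor at that prefix with getSearchKeyRange and walking forward reads one logical queue in order, independently of the others (deleteMatchingFromQueue stops as soon as the classKey changes). Below is a minimal, self-contained sketch of that prefix-keyed pattern; it is not Heritrix code. The class name PrefixQueueDemo, the key layout "queue|sequence", and the sample hosts are illustrative assumptions; Heritrix builds its real keys in calculateInsertKey(CrawlURI).

// A sketch (assumed names, not Heritrix) of one JE database acting as
// several logical queues, keyed by a per-queue prefix.
import java.io.File;

import com.sleepycat.je.Cursor;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseConfig;
import com.sleepycat.je.DatabaseEntry;
import com.sleepycat.je.Environment;
import com.sleepycat.je.EnvironmentConfig;
import com.sleepycat.je.OperationStatus;

public class PrefixQueueDemo {
    public static void main(String[] args) throws Exception {
        EnvironmentConfig envConfig = new EnvironmentConfig();
        envConfig.setAllowCreate(true);
        File dir = new File("/tmp/je-demo");   // hypothetical location
        dir.mkdirs();
        Environment env = new Environment(dir, envConfig);

        DatabaseConfig dbConfig = new DatabaseConfig();
        dbConfig.setAllowCreate(true);
        // Deferred write, as in the constructor above: entries added and
        // removed quickly may never cause disk IO.
        dbConfig.setDeferredWrite(true);
        Database db = env.openDatabase(null, "pending", dbConfig);

        // Interleave entries from two logical queues in one database.
        put(db, "host-a|0001", "http://a.example/1");
        put(db, "host-b|0001", "http://b.example/1");
        put(db, "host-a|0002", "http://a.example/2");

        // Read only queue "host-a": position at its key prefix, then walk
        // forward until the prefix changes (the "rolled into next queue"
        // check in deleteMatchingFromQueue above).
        DatabaseEntry key = new DatabaseEntry("host-a|".getBytes("UTF-8"));
        DatabaseEntry value = new DatabaseEntry();
        Cursor cursor = db.openCursor(null, null);
        OperationStatus status = cursor.getSearchKeyRange(key, value, null);
        while (status == OperationStatus.SUCCESS) {
            String k = new String(key.getData(), "UTF-8");
            if (!k.startsWith("host-a|")) {
                break; // crossed into the next logical queue
            }
            System.out.println(k + " -> " + new String(value.getData(), "UTF-8"));
            status = cursor.getNext(key, value, null);
        }
        cursor.close();
        db.close();
        env.close();
    }

    private static void put(Database db, String k, String v) throws Exception {
        db.put(null, new DatabaseEntry(k.getBytes("UTF-8")),
                new DatabaseEntry(v.getBytes("UTF-8")));
    }
}

The sketch prints only the two host-a entries, in key order, even though a host-b entry sits between them on disk; this is the property that lets a single sorted BDB database behave as many independent work queues.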
