ParallelReader.java

lucene-2.4.0 is a full-text search toolkit
Java
Page 1 of 2
package org.apache.lucene.index;

/**
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import org.apache.lucene.document.Document;
import org.apache.lucene.document.FieldSelector;
import org.apache.lucene.document.FieldSelectorResult;
import org.apache.lucene.document.Fieldable;

import java.io.IOException;
import java.util.*;

/** An IndexReader which reads multiple, parallel indexes.  Each index added
 * must have the same number of documents, but typically each contains
 * different fields.  Each document contains the union of the fields of all
 * documents with the same document number.  When searching, matches for a
 * query term are from the first index added that has the field.
 *
 * <p>This is useful, e.g., with collections that have large fields which
 * change rarely and small fields that change more frequently.  The smaller
 * fields may be re-indexed in a new index and both indexes may be searched
 * together.
 *
 * <p><strong>Warning:</strong> It is up to you to make sure all indexes
 * are created and modified the same way. For example, if you add
 * documents to one index, you need to add the same documents in the
 * same order to the other indexes. <em>Failure to do so will result in
 * undefined behavior</em>.
 */
public class ParallelReader extends IndexReader {
  private List readers = new ArrayList();
  private List decrefOnClose = new ArrayList(); // remember which subreaders to decRef on close
  boolean incRefReaders = false;
  private SortedMap fieldToReader = new TreeMap();
  private Map readerToFields = new HashMap();
  private List storedFieldReaders = new ArrayList();

  private int maxDoc;
  private int numDocs;
  private boolean hasDeletions;

  /** Construct a ParallelReader.
   * <p>Note that all subreaders are closed if this ParallelReader is closed.</p>
   */
  public ParallelReader() throws IOException { this(true); }

  /** Construct a ParallelReader.
   * @param closeSubReaders indicates whether the subreaders should be closed
   * when this ParallelReader is closed
   */
  public ParallelReader(boolean closeSubReaders) throws IOException {
    super();
    this.incRefReaders = !closeSubReaders;
  }

  /** Add an IndexReader.
   * @throws IOException if there is a low-level IO error
   */
  public void add(IndexReader reader) throws IOException {
    ensureOpen();
    add(reader, false);
  }

  /** Add an IndexReader whose stored fields will not be returned.  This can
   * accelerate search when stored fields are only needed from a subset of
   * the IndexReaders.
   *
   * @throws IllegalArgumentException if not all indexes contain the same number
   *     of documents
   * @throws IllegalArgumentException if not all indexes have the same value
   *     of {@link IndexReader#maxDoc()}
   * @throws IOException if there is a low-level IO error
   */
  public void add(IndexReader reader, boolean ignoreStoredFields)
    throws IOException {

    ensureOpen();
    if (readers.size() == 0) {
      this.maxDoc = reader.maxDoc();
      this.numDocs = reader.numDocs();
      this.hasDeletions = reader.hasDeletions();
    }

    if (reader.maxDoc() != maxDoc)                // check compatibility
      throw new IllegalArgumentException
        ("All readers must have same maxDoc: "+maxDoc+"!="+reader.maxDoc());
    if (reader.numDocs() != numDocs)
      throw new IllegalArgumentException
        ("All readers must have same numDocs: "+numDocs+"!="+reader.numDocs());

    Collection fields = reader.getFieldNames(IndexReader.FieldOption.ALL);
    readerToFields.put(reader, fields);
    Iterator i = fields.iterator();
    while (i.hasNext()) {                         // update fieldToReader map
      String field = (String)i.next();
      if (fieldToReader.get(field) == null)
        fieldToReader.put(field, reader);
    }

    if (!ignoreStoredFields)
      storedFieldReaders.add(reader);             // add to storedFieldReaders
    readers.add(reader);

    if (incRefReaders) {
      reader.incRef();
    }
    decrefOnClose.add(Boolean.valueOf(incRefReaders));
  }

  /**
   * Tries to reopen the subreaders.
   * <br>
   * If one or more subreaders could be re-opened (i. e. subReader.reopen()
   * returned a new instance != subReader), then a new ParallelReader instance
   * is returned, otherwise this instance is returned.
   * <p>
   * A re-opened instance might share one or more subreaders with the old
   * instance. Index modification operations result in undefined behavior
   * when performed before the old instance is closed.
   * (see {@link IndexReader#reopen()}).
   * <p>
   * If subreaders are shared, then the reference count of those
   * readers is increased to ensure that the subreaders remain open
   * until the last referring reader is closed.
   *
   * @throws CorruptIndexException if the index is corrupt
   * @throws IOException if there is a low-level IO error
   */
  public IndexReader reopen() throws CorruptIndexException, IOException {
    ensureOpen();

    boolean reopened = false;
    List newReaders = new ArrayList();
    List newDecrefOnClose = new ArrayList();

    boolean success = false;

    try {

      for (int i = 0; i < readers.size(); i++) {
        IndexReader oldReader = (IndexReader) readers.get(i);
        IndexReader newReader = oldReader.reopen();
        newReaders.add(newReader);
        // if at least one of the subreaders was updated we remember that
        // and return a new ParallelReader
        if (newReader != oldReader) {
          reopened = true;
        }
      }

      if (reopened) {
        ParallelReader pr = new ParallelReader();
        for (int i = 0; i < readers.size(); i++) {
          IndexReader oldReader = (IndexReader) readers.get(i);
          IndexReader newReader = (IndexReader) newReaders.get(i);
          if (newReader == oldReader) {
            newDecrefOnClose.add(Boolean.TRUE);
            newReader.incRef();
          } else {
            // this is a new subreader instance, so on close() we don't
            // decRef but close it
            newDecrefOnClose.add(Boolean.FALSE);
          }
          pr.add(newReader, !storedFieldReaders.contains(oldReader));
        }
        pr.decrefOnClose = newDecrefOnClose;
        pr.incRefReaders = incRefReaders;
        success = true;
        return pr;
      } else {
        success = true;
        // No subreader was refreshed
        return this;
      }
    } finally {
      if (!success && reopened) {
        for (int i = 0; i < newReaders.size(); i++) {
          IndexReader r = (IndexReader) newReaders.get(i);
          if (r != null) {
            try {
              if (((Boolean) newDecrefOnClose.get(i)).booleanValue()) {
                r.decRef();
              } else {
                r.close();
              }
            } catch (IOException ignore) {
              // keep going - we want to clean up as much as possible
            }
          }
        }
      }
    }
  }

  public int numDocs() {
    // Don't call ensureOpen() here (it could affect performance)
    return numDocs;
  }

  public int maxDoc() {
    // Don't call ensureOpen() here (it could affect performance)
    return maxDoc;
  }

  public boolean hasDeletions() {
    // Don't call ensureOpen() here (it could affect performance)
    return hasDeletions;
  }

  // check first reader
  public boolean isDeleted(int n) {
    // Don't call ensureOpen() here (it could affect performance)
    if (readers.size() > 0)
      return ((IndexReader)readers.get(0)).isDeleted(n);
    return false;
  }

  // delete in all readers
  protected void doDelete(int n) throws CorruptIndexException, IOException {
    for (int i = 0; i < readers.size(); i++) {
      ((IndexReader)readers.get(i)).deleteDocument(n);
    }
    hasDeletions = true;
  }

  // undeleteAll in all readers
  protected void doUndeleteAll() throws CorruptIndexException, IOException {
    for (int i = 0; i < readers.size(); i++) {
      ((IndexReader)readers.get(i)).undeleteAll();
    }
    hasDeletions = false;
  }

  // append fields from storedFieldReaders
  public Document document(int n, FieldSelector fieldSelector) throws CorruptIndexException, IOException {
    ensureOpen();
    Document result = new Document();
    for (int i = 0; i < storedFieldReaders.size(); i++) {
      IndexReader reader = (IndexReader)storedFieldReaders.get(i);

      boolean include = (fieldSelector==null);
      if (!include) {
        Iterator it = ((Collection) readerToFields.get(reader)).iterator();
        while (it.hasNext())
          if (fieldSelector.accept((String)it.next())!=FieldSelectorResult.NO_LOAD) {
            include = true;
            break;
          }
      }
      if (include) {
        Iterator fieldIterator = reader.document(n, fieldSelector).getFields().iterator();
        while (fieldIterator.hasNext()) {
          result.add((Fieldable)fieldIterator.next());
        }
      }
    }
    return result;
  }

  // get all vectors
  public TermFreqVector[] getTermFreqVectors(int n) throws IOException {
    ensureOpen();
    ArrayList results = new ArrayList();
    Iterator i = fieldToReader.entrySet().iterator();
    while (i.hasNext()) {
      Map.Entry e = (Map.Entry)i.next();
      String field = (String)e.getKey();
      IndexReader reader = (IndexReader)e.getValue();
      TermFreqVector vector = reader.getTermFreqVector(n, field);
      if (vector != null)
        results.add(vector);
    }
    return (TermFreqVector[])
      results.toArray(new TermFreqVector[results.size()]);
  }

  public TermFreqVector getTermFreqVector(int n, String field)
    throws IOException {
    ensureOpen();
    IndexReader reader = ((IndexReader)fieldToReader.get(field));
    return reader==null ? null : reader.getTermFreqVector(n, field);
  }

  public void getTermFreqVector(int docNumber, String field, TermVectorMapper mapper) throws IOException {
    ensureOpen();
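
The class javadoc above describes the parallel-index pattern: every index holds the same documents in the same order, but each contributes different fields. The following is a minimal usage sketch of that pattern, assuming Lucene 2.4-era APIs; the index paths, field name, and query term are hypothetical.

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.ParallelReader;
import org.apache.lucene.index.Term;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TopDocs;

public class ParallelReaderUsage {
  public static void main(String[] args) throws Exception {
    // Hypothetical index locations. Both indexes must contain the same
    // documents in the same order, as the class javadoc warns.
    IndexReader large = IndexReader.open("/path/to/rarely-changing-index");
    IndexReader small = IndexReader.open("/path/to/frequently-updated-index");

    // Default constructor closes the subreaders when the ParallelReader is closed.
    ParallelReader parallel = new ParallelReader();
    parallel.add(large);
    parallel.add(small);  // add(small, true) would skip this reader's stored fields

    // Document N is the union of document N's fields from both indexes;
    // a query term matches in the first added index that has its field.
    IndexSearcher searcher = new IndexSearcher(parallel);
    TopDocs hits = searcher.search(new TermQuery(new Term("title", "lucene")), null, 10);
    System.out.println("total hits: " + hits.totalHits);

    searcher.close();
    parallel.close();  // also closes 'large' and 'small'
  }
}

If the indexes fall out of sync (different maxDoc or numDocs), add() rejects the second reader with an IllegalArgumentException, per the compatibility checks shown above.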
