
SloppyPhraseScorer.java

lucene-2.4.0 is a full-text search toolkit.
Language: Java
package org.apache.lucene.search;

/**
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import org.apache.lucene.index.TermPositions;

import java.io.IOException;
import java.util.HashMap;

final class SloppyPhraseScorer extends PhraseScorer {
    private int slop;
    private PhrasePositions repeats[];
    private PhrasePositions tmpPos[]; // for flipping repeating pps.
    private boolean checkedRepeats;

    SloppyPhraseScorer(Weight weight, TermPositions[] tps, int[] offsets, Similarity similarity,
                       int slop, byte[] norms) {
        super(weight, tps, offsets, similarity, norms);
        this.slop = slop;
    }

    /**
     * Score a candidate doc for all slop-valid position-combinations (matches)
     * encountered while traversing/hopping the PhrasePositions.
     * <br> The score contribution of a match depends on the distance:
     * <br> - highest score for distance=0 (exact match).
     * <br> - score gets lower as distance gets higher.
     * <br>Example: for query "a b"~2, a document "x a b a y" can be scored twice:
     * once for "a b" (distance=0), and once for "b a" (distance=2).
     * <br>Possibly not all valid combinations are encountered, because for efficiency
     * we always propagate the least PhrasePosition. This allows to base on
     * PriorityQueue and move forward faster.
     * As a result, for example, document "a b c b a"
     * would score differently for queries "a b c"~4 and "c b a"~4, although
     * they really are equivalent.
     * Similarly, for doc "a b c b a f g", query "c b"~2
     * would get same score as "g f"~2, although "c b"~2 could be matched twice.
     * We may want to fix this in the future (currently not, for performance reasons).
     */
    protected final float phraseFreq() throws IOException {
        int end = initPhrasePositions();

        float freq = 0.0f;
        boolean done = (end < 0);
        while (!done) {
            PhrasePositions pp = (PhrasePositions) pq.pop();
            int start = pp.position;
            int next = ((PhrasePositions) pq.top()).position;
            boolean tpsDiffer = true;
            for (int pos = start; pos <= next || !tpsDiffer; pos = pp.position) {
                if (pos <= next && tpsDiffer)
                    start = pos;              // advance pp to min window
                if (!pp.nextPosition()) {
                    done = true;              // ran out of a term -- done
                    break;
                }
                PhrasePositions pp2 = null;
                tpsDiffer = !pp.repeats || (pp2 = termPositionsDiffer(pp)) == null;
                if (pp2 != null && pp2 != pp) {
                    pp = flip(pp, pp2);       // flip pp to pp2
                }
            }

            int matchLength = end - start;
            if (matchLength <= slop)
                freq += getSimilarity().sloppyFreq(matchLength); // score match

            if (pp.position > end)
                end = pp.position;
            pq.put(pp);                       // restore pq
        }

        return freq;
    }

    // flip pp2 and pp in the queue: pop until finding pp2, insert back all but pp2, insert pp back.
    // assumes: pp!=pp2, pp2 in pq, pp not in pq.
    // called only when there are repeating pps.
    private PhrasePositions flip(PhrasePositions pp, PhrasePositions pp2) {
        int n = 0;
        PhrasePositions pp3;
        // pop until finding pp2
        while ((pp3 = (PhrasePositions) pq.pop()) != pp2) {
            tmpPos[n++] = pp3;
        }
        // insert back all but pp2
        for (n--; n >= 0; n--) {
            pq.insert(tmpPos[n]);
        }
        // insert pp back
        pq.put(pp);
        return pp2;
    }

    /**
     * Init PhrasePositions in place.
     * There is a one time initialization for this scorer:
     * <br>- Put in repeats[] each pp that has another pp with same position in the doc.
     * <br>- Also mark each such pp by pp.repeats = true.
     * <br>Later can consult with repeats[] in termPositionsDiffer(pp), making that check efficient.
     * In particular, this allows to score queries with no repetitions with no overhead due to this computation.
     * <br>- Example 1 - query with no repetitions: "ho my"~2
     * <br>- Example 2 - query with repetitions: "ho my my"~2
     * <br>- Example 3 - query with repetitions: "my ho my"~2
     * <br>Init per doc w/repeats in query, includes propagating some repeating pp's to avoid false phrase detection.
     * @return end (max position), or -1 if any term ran out (i.e. done)
     * @throws IOException
     */
    private int initPhrasePositions() throws IOException {
        int end = 0;

        // no repeats at all (most common case is also the simplest one)
        if (checkedRepeats && repeats == null) {
            // build queue from list
            pq.clear();
            for (PhrasePositions pp = first; pp != null; pp = pp.next) {
                pp.firstPosition();
                if (pp.position > end)
                    end = pp.position;
                pq.put(pp);         // build pq from list
            }
            return end;
        }

        // position the pp's
        for (PhrasePositions pp = first; pp != null; pp = pp.next)
            pp.firstPosition();

        // one time initialization for this scorer
        if (!checkedRepeats) {
            checkedRepeats = true;
            // check for repeats
            HashMap m = null;
            for (PhrasePositions pp = first; pp != null; pp = pp.next) {
                int tpPos = pp.position + pp.offset;
                for (PhrasePositions pp2 = pp.next; pp2 != null; pp2 = pp2.next) {
                    int tpPos2 = pp2.position + pp2.offset;
                    if (tpPos2 == tpPos) {
                        if (m == null)
                            m = new HashMap();
                        pp.repeats = true;
                        pp2.repeats = true;
                        m.put(pp, null);
                        m.put(pp2, null);
                    }
                }
            }
            if (m != null)
                repeats = (PhrasePositions[]) m.keySet().toArray(new PhrasePositions[0]);
        }

        // with repeats must advance some repeating pp's so they all start with differing tp's
        if (repeats != null) {
            for (int i = 0; i < repeats.length; i++) {
                PhrasePositions pp = repeats[i];
                PhrasePositions pp2;
                while ((pp2 = termPositionsDiffer(pp)) != null) {
                    if (!pp2.nextPosition())  // out of pps that do not differ, advance the pp with higher offset
                        return -1;            // ran out of a term -- done
                }
            }
        }

        // build queue from list
        pq.clear();
        for (PhrasePositions pp = first; pp != null; pp = pp.next) {
            if (pp.position > end)
                end = pp.position;
            pq.put(pp);         // build pq from list
        }

        if (repeats != null) {
            tmpPos = new PhrasePositions[pq.size()];
        }

        return end;
    }

    /**
     * We disallow two pp's to have the same TermPosition, thereby verifying multiple occurrences
     * in the query of the same word would go elsewhere in the matched doc.
     * @return null if differ (i.e. valid) otherwise return the higher offset PhrasePositions
     * out of the first two PPs found to not differ.
     */
    private PhrasePositions termPositionsDiffer(PhrasePositions pp) {
        // efficiency note: a more efficient implementation could keep a map between repeating
        // pp's, so that if pp1a, pp1b, pp1c are repeats of term1, and pp2a, pp2b are repeats
        // of term2, pp2a would only be checked against pp2b but not against pp1a, pp1b, pp1c.
        // However this would complicate code, for a rather rare case, so choice is to compromise here.
        int tpPos = pp.position + pp.offset;
        for (int i = 0; i < repeats.length; i++) {
            PhrasePositions pp2 = repeats[i];
            if (pp2 == pp)
                continue;
            int tpPos2 = pp2.position + pp2.offset;
            if (tpPos2 == tpPos)
                return pp.offset > pp2.offset ? pp : pp2; // do not differ: return the one with higher offset.
        }
        return null;
    }
}
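SloppyPhraseScorer is package-private and is not instantiated directly by applications; in Lucene 2.4 it is created internally by PhraseQuery's weight when the query's slop is greater than zero. The sketch below is a minimal illustration, not part of the file above, reproducing the "a b"~2 example from the phraseFreq() Javadoc against the single document "x a b a y". It assumes the public Lucene 2.4 API (RAMDirectory, IndexWriter, PhraseQuery.setSlop); the class name SloppyPhraseExample and the field name "body" are arbitrary choices for the example.

import org.apache.lucene.analysis.WhitespaceAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.Term;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.PhraseQuery;
import org.apache.lucene.search.TopDocs;
import org.apache.lucene.store.RAMDirectory;

public class SloppyPhraseExample {
    public static void main(String[] args) throws Exception {
        // Index a single document whose body is "x a b a y".
        RAMDirectory dir = new RAMDirectory();
        IndexWriter writer = new IndexWriter(dir, new WhitespaceAnalyzer(), true,
                                             IndexWriter.MaxFieldLength.UNLIMITED);
        Document doc = new Document();
        doc.add(new Field("body", "x a b a y", Field.Store.YES, Field.Index.ANALYZED));
        writer.addDocument(doc);
        writer.close();

        // "a b"~2 matches this document twice: as "a b" (distance 0) and as "b a"
        // (distance 2), so the sloppy scorer accumulates sloppyFreq(0) + sloppyFreq(2).
        PhraseQuery query = new PhraseQuery();
        query.add(new Term("body", "a"));
        query.add(new Term("body", "b"));
        query.setSlop(2);

        IndexSearcher searcher = new IndexSearcher(dir);
        TopDocs hits = searcher.search(query, null, 10);
        System.out.println("hits=" + hits.totalHits + ", score=" + hits.scoreDocs[0].score);
        searcher.close();
    }
}

A phrase query with a repeated term, such as "my ho my"~2 from the initPhrasePositions() Javadoc, exercises the repeats[] and flip() machinery above; it can be built the same way by adding the term "my" twice.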
