gopherenum-breadth.c

Harvest is a robot that downloads HTML web pages.
Language: C
Page 1 of 2
static char rcsid[] = "$Id: gopherenum-breadth.c,v 2.5 2000/02/03 12:45:56 sxw Exp $";
/*
 *  gopherenum-breadth.c - RootNode URL enumerator for Gopher URLs
 *
 *  Usage: gopherenum-breadth gopher-URL
 *
 *  Outputs the following format:
 *
 *      URL of tree root
 *      URL <tab> md5
 *      ...
 *      URL <tab> md5
 *
 *  DEBUG: section  43, level 1, 5, 9   Gatherer enumeration for Gopher
 *  AUTHOR: Harvest derived
 *
 *  Harvest Indexer http://harvest.sourceforge.net/
 *  -----------------------------------------------
 *
 *  The Harvest Indexer is a continued development of code developed by
 *  the Harvest Project. Development is carried out by numerous individuals
 *  in the Internet community, and is not officially connected with the
 *  original Harvest Project or its funding sources.
 *
 *  Please mail lee@arco.de if you are interested in participating
 *  in the development effort.
 *
 *  This program is free software; you can redistribute it and/or modify
 *  it under the terms of the GNU General Public License as published by
 *  the Free Software Foundation; either version 2 of the License, or
 *  (at your option) any later version.
 *
 *  This program is distributed in the hope that it will be useful,
 *  but WITHOUT ANY WARRANTY; without even the implied warranty of
 *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 *  GNU General Public License for more details.
 *
 *  You should have received a copy of the GNU General Public License
 *  along with this program; if not, write to the Free Software
 *  Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
 */
/*  ----------------------------------------------------------------------
 *  Copyright (c) 1994, 1995.  All rights reserved.
 *
 *    The Harvest software was developed by the Internet Research Task
 *    Force Research Group on Resource Discovery (IRTF-RD):
 *
 *          Mic Bowman of Transarc Corporation.
 *          Peter Danzig of the University of Southern California.
 *          Darren R. Hardy of the University of Colorado at Boulder.
 *          Udi Manber of the University of Arizona.
 *          Michael F. Schwartz of the University of Colorado at Boulder.
 *          Duane Wessels of the University of Colorado at Boulder.
 *
 *    This copyright notice applies to software in the Harvest
 *    ``src/'' directory only.  Users should consult the individual
 *    copyright notices in the ``components/'' subdirectories for
 *    copyright information about other software bundled with the
 *    Harvest source code distribution.
 *
 *  TERMS OF USE
 *
 *    The Harvest software may be used and re-distributed without
 *    charge, provided that the software origin and research team are
 *    cited in any use of the system.  Most commonly this is
 *    accomplished by including a link to the Harvest Home Page
 *    (gopher://harvest.cs.colorado.edu/) from the query page of any
 *    Broker you deploy, as well as in the query result pages.  These
 *    links are generated automatically by the standard Broker
 *    software distribution.
 *
 *    The Harvest software is provided ``as is'', without express or
 *    implied warranty, and with no support nor obligation to assist
 *    in its use, correction, modification or enhancement.  We assume
 *    no liability with respect to the infringement of copyrights,
 *    trade secrets, or any patents, and are not responsible for
 *    consequential damages.  Proper use of the Harvest software is
 *    entirely the responsibility of the user.
 *
 *  DERIVATIVE WORKS
 *
 *    Users may make derivative works from the Harvest software, subject
 *    to the following constraints:
 *
 *      - You must include the above copyright notice and these
 *        accompanying paragraphs in all forms of derivative works,
 *        and any documentation and other materials related to such
 *        distribution and use acknowledge that the software was
 *        developed at the above institutions.
 *
 *      - You must notify IRTF-RD regarding your distribution of
 *        the derivative work.
 *
 *      - You must clearly notify users that you are distributing
 *        a modified version and not the original Harvest software.
 *
 *      - Any derivative product is also subject to these copyright
 *        and use restrictions.
 *
 *    Note that the Harvest software is NOT in the public domain.  We
 *    retain copyright, as specified above.
 *
 *  HISTORY OF FREE SOFTWARE STATUS
 *
 *    Originally we required sites to license the software in cases
 *    where they were going to build commercial products/services
 *    around Harvest.  In June 1995 we changed this policy.  We now
 *    allow people to use the core Harvest software (the code found in
 *    the Harvest ``src/'' directory) for free.  We made this change
 *    in the interest of encouraging the widest possible deployment of
 *    the technology.  The Harvest software is really a reference
 *    implementation of a set of protocols and formats, some of which
 *    we intend to standardize.  We encourage commercial
 *    re-implementations of code complying to this set of standards.
 *
 */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <memory.h>
#include <string.h>
#include <signal.h>
#include <gdbm.h>
#include "util.h"
#include "url.h"
#define PUBLIC extern
#include "filter.h"

typedef struct _list_t {
    void *ptr;
    int depth;
    struct _list_t *next;
} list_t;

list_t *head = NULL;
list_t **Tail = NULL;

/* define HOST_COUNT_IP to 'count' visited hosts based on IP, not the   */
/* given hostname.  This way aliased machines will be properly          */
/* enumerated                                                           */
#define HOST_COUNT_IP

/* Global variables */
int max_depth = 0;
int cur_depth = 0;
int depth_hist[100];

/* Local variables */
static int url_max = 0;
static int nurls = 0;
static int host_max = 0;
static int nhosts = 0;
static char *tree_root = NULL;
static char *urldb_filename = NULL;
static char *hostdb_filename = NULL;
static char *md5db_filename = NULL;
static GDBM_FILE urldbf = NULL;
static GDBM_FILE hostdbf = NULL;
static GDBM_FILE md5dbf = NULL;
static FILE *not_visited = NULL;

/* Local functions */
static void usage();
static void mark_failed();
static void mark_retrieved();
static void sigdie();
static int url_in_db();
static int md5_in_db();
static int gopher_enum();
extern int RobotsTxtCheck _PARAMS((URL *));

list_t *add_to_list(url, depth)
     char *url;
     int depth;
{
    list_t *l = NULL;

    l = (list_t *) xmalloc(sizeof(list_t));
    l->ptr = (void *) xstrdup(url);
    l->next = (list_t *) NULL;
    l->depth = depth;
    *Tail = l;
    Tail = &(l->next);
    return l;
}

list_t *free_from_list(l)
     list_t *l;
{
    list_t *r = NULL;

    r = l->next;
    xfree(l->ptr);
    xfree(l);
    return r;
}

/* ---------------------------------------------------------------------- */

/*
 *  mark_failed() - Mark that a URL failed to be retrieved, so that the
 *  enumerator doesn't try it again.  This option may not be wanted by
 *  some users and so should be configurable.
 */
static void mark_failed(URL *up)
{
    datum k, d;

    Debug(43, 9, ("mark_failed: url='%s'", up->url));
    k.dptr = xstrdup(up->url);
    k.dsize = strlen(k.dptr) + 1;
    d.dptr = xstrdup("FailedAccess");
    d.dsize = strlen(d.dptr) + 1;
    if (!gdbm_exists(urldbf, k) && gdbm_store(urldbf, k, d, GDBM_INSERT))
        fatal("GDBM URLDB: %s: %s", k.dptr, gdbm_strerror(gdbm_errno));
    xfree(k.dptr);
    xfree(d.dptr);
}

/*
 *  mark_retrieved() - Mark that the given URL was successfully retrieved,
 *  so that the URL is not retrieved again.  This prevents cycles in the
 *  enumeration.
 */
static void mark_retrieved(up)
     URL *up;
{
    datum k, d;

    Debug(43, 9, ("mark_retrieved: url='%s', md5='%s'\n", up->url, up->md5));
    k.dptr = xstrdup(up->url);
    k.dsize = strlen(k.dptr) + 1;
    d.dptr = xstrdup(up->md5);
    d.dsize = strlen(d.dptr) + 1;
    if (!gdbm_exists(urldbf, k) && gdbm_store(urldbf, k, d, GDBM_INSERT))
        fatal("GDBM URLDB: %s: %s", k.dptr, gdbm_strerror(gdbm_errno));
    if (!gdbm_exists(md5dbf, d) && gdbm_store(md5dbf, d, k, GDBM_INSERT))
        fatal("GDBM MD5DB: %s: %s", k.dptr, gdbm_strerror(gdbm_errno));
    xfree(k.dptr);
    xfree(d.dptr);

    /* Print URL to stdout to enumerate; flush to keep pipe moving */
    fprintf(stdout, "%s\t%s\n", up->url, up->md5);      /* URL <tab> MD5 */
    fflush(stdout);

    if (nurls++ >= url_max) {
        Log("Truncating RootNode %s at %d LeafNode URLs\n",
            tree_root, url_max);
        url_close(up);
        up = NULL;
        sigdie(0);
    }
}

/*
 *  url_in_db() - check to see if the URL is in the database
 */
static int url_in_db(url)
     char *url;
{
    datum k;
    int r;

    Debug(43, 9, ("url_in_db: checking for url='%s'\n", url));
    k.dptr = xstrdup(url);
    k.dsize = strlen(k.dptr) + 1;
    r = gdbm_exists(urldbf, k);
    xfree(k.dptr);
    return (r);
}

/*
 *  md5_in_db() - check to see if the MD5 is in the database
 */
static int md5_in_db(md5)
     char *md5;
{
    datum k;
    int r;

    k.dptr = xstrdup(md5);
    k.dsize = strlen(k.dptr) + 1;
    r = gdbm_exists(md5dbf, k);
    xfree(k.dptr);
    return (r);
}

/*
 *  host_in_db() - check to see if the host is in the database
 */
static int host_in_db(host)
     char *host;
{
    datum k;
    int r;
#ifdef HOST_COUNT_IP
    Host *h;

    h = get_host(host);
    if (!h)
        return 0;
    k.dptr = xstrdup(h->dotaddr);
#else
    k.dptr = xstrdup(host);
#endif
    k.dsize = strlen(k.dptr) + 1;
    r = gdbm_exists(hostdbf, k);
    xfree(k.dptr);
    return (r);
}

/*
 *  visit_server() - Determine if we should visit the server.  Return
 *  zero if we should not process the URL; otherwise, return non-zero.
 */
static int visit_server(up)
     URL *up;
{
    datum k, d;
#ifdef HOST_COUNT_IP
    Host *h = NULL;
#endif

    if (host_in_db(up->host))   /* Host is already in the db */
        return (1);
    if (++nhosts > host_max)
        return (0);
#ifdef HOST_COUNT_IP
    h = get_host(up->host);
    if (!h)
        return (0);
    k.dptr = xstrdup(h->dotaddr);
#else
    k.dptr = xstrdup(up->host);
#endif
    k.dsize = strlen(k.dptr) + 1;
    d.dptr = xstrdup(up->url);
    d.dsize = strlen(d.dptr) + 1;
    if (gdbm_store(hostdbf, k, d, GDBM_INSERT))
        fatal("GDBM HOSTDB: %s: %s", k.dptr, gdbm_strerror(gdbm_errno));
    xfree(k.dptr);
    xfree(d.dptr);
    return (1);
}

int url_is_allowed(url)
     char *url;
{
    URL *tup = NULL;
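The breadth-first order comes from the list_t work queue defined above: add_to_list() appends each newly discovered URL at *Tail in O(1) and advances Tail to the new node's next pointer, while the enumerator consumes entries from head, so URLs are visited in discovery order. Below is a standalone sketch of that tail-pointer FIFO pattern, assuming plain malloc/strdup/free in place of Harvest's xmalloc/xstrdup/xfree and using invented example URLs; in the original file, Tail is presumably pointed at head during start-up in the portion not shown on this page.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Same shape as the list_t above, without the Harvest allocator wrappers. */
typedef struct node {
    char *url;
    int depth;
    struct node *next;
} node_t;

static node_t *head = NULL;    /* next URL to process */
static node_t **tail = &head;  /* slot that the next enqueue fills */

/* Append at the tail: constant time, no traversal, preserves FIFO order. */
static void enqueue(const char *url, int depth)
{
    node_t *n = malloc(sizeof(node_t));
    n->url = strdup(url);
    n->depth = depth;
    n->next = NULL;
    *tail = n;           /* link the node into the previous tail slot */
    tail = &n->next;     /* its next pointer becomes the new tail slot */
}

/* Free the head node and return the rest of the queue. */
static node_t *dequeue(node_t *n)
{
    node_t *rest = n->next;
    free(n->url);
    free(n);
    if (rest == NULL)
        tail = &head;    /* queue drained: reset the tail slot */
    return rest;
}

int main(void)
{
    enqueue("gopher://example.org/", 0);
    enqueue("gopher://example.org/1/docs", 1);
    enqueue("gopher://example.org/1/src", 1);
    while (head != NULL) {
        printf("depth %d  %s\n", head->depth, head->url);
        head = dequeue(head);
    }
    return 0;
}

Keeping a pointer to the tail's next-pointer slot, rather than to the tail node itself, avoids a special case for the empty queue: the first enqueue writes straight into head.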

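Cycle avoidance and duplicate detection rest on the GDBM databases touched in mark_retrieved(), url_in_db() and md5_in_db(): every key and value is a NUL-terminated string stored with dsize = strlen + 1, and each GDBM_INSERT is guarded by gdbm_exists() because inserting an already-present key is reported as a failure. The sketch below shows that store-and-check pattern in isolation; the database filename and the key/value strings are illustrative, not the names the real enumerator derives from its command line.

#include <stdio.h>
#include <string.h>
#include <gdbm.h>

/* Build a GDBM datum from a C string, including the trailing NUL,
 * exactly as the enumerator's database routines do. */
static datum str_datum(const char *s)
{
    datum d;
    d.dptr = (char *) s;
    d.dsize = (int) strlen(s) + 1;
    return d;
}

int main(void)
{
    /* "visited.gdbm" is an illustrative filename. */
    GDBM_FILE db = gdbm_open("visited.gdbm", 0, GDBM_WRCREAT, 0644, NULL);
    if (db == NULL) {
        fprintf(stderr, "gdbm_open: %s\n", gdbm_strerror(gdbm_errno));
        return 1;
    }
    datum url = str_datum("gopher://example.org/1/docs");
    datum md5 = str_datum("0123456789abcdef0123456789abcdef");

    /* Insert only if unseen: GDBM_INSERT refuses duplicate keys, which is
     * why the original guards each store with gdbm_exists(). */
    if (!gdbm_exists(db, url) && gdbm_store(db, url, md5, GDBM_INSERT) != 0)
        fprintf(stderr, "store failed: %s\n", gdbm_strerror(gdbm_errno));

    printf("seen? %s\n", gdbm_exists(db, url) ? "yes" : "no");
    gdbm_close(db);
    return 0;
}

Because the trailing NUL is counted in dsize, every later lookup must use the same convention, or gdbm_exists() will miss keys the enumerator has already stored.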