

urifind

urifind: find URIs in text samples or files and print them to STDOUT.
#!/usr/local/bin/perl -w
# ----------------------------------------------------------------------
# $Id: urifind,v 1.2 2003/07/10 20:54:09 dlc Exp $
# ----------------------------------------------------------------------
# urifind - find URIs in a document and dump them to STDOUT.
# Copyright (C) 2003 darren chamberlain <darren@cpan.org>
# ----------------------------------------------------------------------

use strict;
use vars qw($VERSION $REVISION);
$VERSION = 1.00;
$REVISION = sprintf "%d.%02d", q$Revision: 1.2 $ =~ /(\d+)\.(\d+)/;

use File::Basename qw(basename);
use Getopt::Long qw(GetOptions);
use IO::File;
use URI::Find;

# What to do, and how
my $help     = 0;
my $version  = 0;
my $sort     = 0;
my $reverse  = 0;
my $unique   = 0;
my $prefix   = 0;
my $noprefix = 0;
my @pats     = ();
my @schemes  = ();
my $dump     = 0;

Getopt::Long::Configure(qw{no_ignore_case bundling});
GetOptions('s!'   => \$sort,
           'u!'   => \$unique,
           'p!'   => \$prefix,
           'n!'   => \$noprefix,
           'r!'   => \$reverse,
           'h!'   => \$help,
           'v!'   => \$version,
           'd!'   => sub { $dump = 1 },
           'D!'   => sub { $dump = 2 },
           'P=s@' => \@pats,
           'S=s@' => \@schemes);

if ($help || $version) {
    my $prog = basename($0);
    if ($help) {
        print <<HELP;
$prog - find URIs in a document and dump them to STDOUT.

    $prog [OPTIONS] file1 [file2[, file3[, ...]]]

Options:
    -s          Sort results.
    -r          Reverse sort results (implies -s).
    -u          Return unique results only.
    -n          Don't include filename in output.
    -p          Include filename in output (0 by default, but 1 if
                multiple files are included on the command line).
    -P \$re      Print only lines matching regex '\$re'
                (may be specified multiple times).
    -S \$scheme  Only this scheme (may be specified multiple times).
    -h          This help screen.
    -v          Display version and exit.
    -d          Dump compiled regexes and continue.
    -D          Dump compiled regexes and exit.
HELP
    }
    else {
        printf "$prog v.%.02f\n", $VERSION;
    }
    exit(0);
}

my (@uris, $count);
unshift @ARGV, \*STDIN unless @ARGV;

if (($prefix + $noprefix) > 1) {
    my $prog = basename $0;
    die "Can't specify -p and -n at the same time; try $prog -h\n";
}

# Print filename with matches?  -p / -n
# If there is more than one file, then show filenames by
# default, unless explicitly asked not to (-n)
if (@ARGV > 1) {
    $prefix = 1 unless $noprefix;
}
else {
    $prefix = 0 unless $prefix;
}

# Add schemes to the list of regexen
if (@schemes) {
    unshift @pats => sprintf '^(\b%s\b):' => join '\b|\b' => @schemes;
}

# If we are dumping (-d, -D), then dump.  Exit if -D.
if ($dump) {
    print STDERR "\$scheme = '" . (defined $pats[0] ? $pats[0] : '') . "'\n";
    print STDERR "\@pats = ('" . join("', '", @pats) . "')\n";
    exit if $dump == 2;
}

# Find the URIs
for my $argv (@ARGV) {
    my ($name, $fh, $data);
    $argv = \*STDIN if ($argv eq '-');

    if (ref $argv eq 'GLOB') {
        local $/;
        $data = <$argv>;
        $name = '<stdin>';
    }
    else {
        local $/;
        $fh = IO::File->new($argv) or die "Can't open $argv: $!";
        $data = <$fh>;
        $name = $argv;
    }

    my $finder = URI::Find->new(sub { push @uris => [ $name, $_[0] ] });
    $finder->find(\$data);
}

# Apply patterns, in @pats
for my $pat (@pats) {
    @uris = grep { $_->[1] =~ /$pat/ } @uris;
}

# Remove redundant links
if ($unique) {
    my %unique;
    @uris = grep { ++$unique{$_->[1]} == 1 } @uris;
}

# Sort links, possibly in reverse
if ($sort || $reverse) {
    if ($reverse) {
        @uris = sort { $b->[1] cmp $a->[1] } @uris;
    }
    else {
        @uris = sort { $a->[1] cmp $b->[1] } @uris;
    }
}

# Flatten the arrayrefs
if ($prefix) {
    @uris = map { join ': ' => @$_ } @uris;
}
else {
    @uris = map { $_->[1] } @uris;
}

print map { "$_\n" } @uris;

exit 0;

__END__

=head1 NAME

urifind - find URIs in a document and dump them to STDOUT.

=head1 SYNOPSIS

    $ urifind file

=head1 DESCRIPTION

F<urifind> is a simple script that finds URIs in one or more files
(using C<URI::Find>), and outputs them to STDOUT.  That's it.

To find all the URIs in F<file1>, use:

    $ urifind file1

To find the URIs in multiple files, simply list them as arguments:

    $ urifind file1 file2 file3

F<urifind> will read from C<STDIN> if no files are given or if a
filename of C<-> is specified:

    $ wget http://www.boston.com/ -O - | urifind

When multiple files are listed, F<urifind> prefixes each found URI
with the file from which it came:

    $ urifind file1 file2
    file1: http://www.boston.com/index.html
    file2: http://use.perl.org/

This can be turned on for single files with the C<-p> ("prefix") switch:

    $ urifind -p file3
    file3: http://fsck.com/rt/

It can also be turned off for multiple files with the C<-n> ("no
prefix") switch:

    $ urifind -n file1 file2
    http://www.boston.com/index.html
    http://use.perl.org/

By default, URIs will be displayed in the order found; to sort them
ascii-betically, use the C<-s> ("sort") option.  To reverse sort them,
use the C<-r> ("reverse") flag (C<-r> implies C<-s>).

    $ urifind -s file1 file2
    http://use.perl.org/
    http://www.boston.com/index.html
    mailto:webmaster@boston.com

    $ urifind -r file1 file2
    mailto:webmaster@boston.com
    http://www.boston.com/index.html
    http://use.perl.org/

Finally, F<urifind> supports limiting the returned URIs by scheme or
by arbitrary pattern, using the C<-S> option (for schemes) and the
C<-P> option.  Both C<-S> and C<-P> can be specified multiple times:

    $ urifind -S mailto file1
    mailto:webmaster@boston.com

    $ urifind -S mailto -S http
    mailto:webmaster@boston.com
    http://www.boston.com/index.html

C<-P> takes an arbitrary Perl regex.  It might need to be protected
from the shell:

    $ urifind -P 's?html?' file1
    http://www.boston.com/index.html

    $ urifind -P '\.org\b' -S http file4
    http://www.gnu.org/software/wget/wget.html

Add a C<-d> to have F<urifind> dump the regexen generated from C<-S>
and C<-P> to C<STDERR>.  C<-D> does the same but exits immediately:

    $ urifind -P '\.org\b' -S http -D
    $scheme = '^(\bhttp\b):'
    @pats = ('^(\bhttp\b):', '\.org\b')

To remove duplicates from the results, use the C<-u> ("unique")
switch.

=head1 OPTION SUMMARY

=over 4

=item -s

Sort results.

=item -r

Reverse sort results (implies -s).

=item -u

Return unique results only.

=item -n

Don't include filename in output.

=item -p

Include filename in output (0 by default, but 1 if multiple files are
included on the command line).

=item -P $re

Print only lines matching regex '$re' (may be specified multiple times).

=item -S $scheme

Only this scheme (may be specified multiple times).

=item -h

Help summary.

=item -v

Display version and exit.

=item -d

Dump compiled regexes for C<-S> and C<-P> to C<STDERR>.

=item -D

Same as C<-d>, but exit after dumping.

=back

=head1 VERSION

This is F<urifind>, revision $Revision: 1.2 $.

=head1 AUTHOR

darren chamberlain E<lt>darren@cpan.orgE<gt>

=head1 COPYRIGHT

(C) 2003 darren chamberlain

This library is free software; you may distribute it and/or modify it
under the same terms as Perl itself.

=head1 SEE ALSO

L<Perl>
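The script's overall pipeline (find URIs, filter by scheme, deduplicate, sort) can be sketched in another language for readers who don't write Perl. This is a rough approximation, not the script itself: a naive hand-rolled regex stands in for Perl's URI::Find module, and only the -S, -u, -s, and -r behaviour is modelled; function and variable names here are illustrative, not part of urifind.

```python
import re

# Naive URI matcher: "scheme://..." plus mailto:. URI::Find is far more
# thorough; this just illustrates the pipeline's shape.
URI_RE = re.compile(r'\b[a-z][a-z0-9+.-]*://\S+|\bmailto:\S+')

def urifind(text, schemes=None, unique=False, sort=False, reverse=False):
    uris = URI_RE.findall(text)
    if schemes:
        # Built the same way as the script's -S filter, e.g.
        # '^(\bhttp\b|\bmailto\b):' for -S http -S mailto.
        pat = re.compile(r'^(\b%s\b):' % r'\b|\b'.join(schemes))
        uris = [u for u in uris if pat.search(u)]
    if unique:
        # Keep the first occurrence of each URI, preserving order.
        seen = set()
        uris = [u for u in uris if not (u in seen or seen.add(u))]
    if sort or reverse:
        uris = sorted(uris, reverse=reverse)
    return uris
```

For example, `urifind(text, schemes=['mailto'])` mirrors `urifind -S mailto file1`, and `urifind(text, unique=True, sort=True)` mirrors `urifind -u -s file1`.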
