
hql.g

Source code (and jar package) from the Hibernate open-source framework. Hope you find it useful.
//             ( 4)  relational: <, <=, >, >=,
//                   LIKE, NOT LIKE, BETWEEN, NOT BETWEEN, IN, NOT IN
//             ( 3)  addition and subtraction: +(binary) -(binary)
//             ( 2)  multiplication: * / %, concatenate: ||
// highest --> ( 1)  +(unary) -(unary)
//                   []   () (method call)  . (dot -- identifier qualification)
//                   aggregate function
//                   ()  (explicit parenthesis)
//
// Note that the above precedence levels map to the rules below...
// Once you have a precedence chart, writing the appropriate rules as below
// is usually very straightforward

logicalExpression
	: expression
	;

// Main expression rule
expression
	: logicalOrExpression
	;

// level 7 - OR
logicalOrExpression
	: logicalAndExpression ( OR^ logicalAndExpression )*
	;

// level 6 - AND, NOT
logicalAndExpression
	: negatedExpression ( AND^ negatedExpression )*
	;

// NOT nodes aren't generated.  Instead, the operator in the sub-tree will be
// negated, if possible.  Expressions without a NOT parent are passed through.
negatedExpression!
{ weakKeywords(); } // Weak keywords can appear in an expression, so look ahead.
	: NOT^ x:negatedExpression { #negatedExpression = negateNode(#x); }
	| y:equalityExpression { #negatedExpression = #y; }
	;

//## OP: EQ | LT | GT | LE | GE | NE | SQL_NE | LIKE;

// level 5 - EQ, NE
equalityExpression
	: x:relationalExpression (
		( EQ^
		| is:IS^	{ #is.setType(EQ); } (NOT! { #is.setType(NE); } )?
		| NE^
		| ne:SQL_NE^	{ #ne.setType(NE); }
		) y:relationalExpression)* {
			// Post process the equality expression to clean up 'is null', etc.
			#equalityExpression = processEqualityExpression(#equalityExpression);
		}
	;

// level 4 - LT, GT, LE, GE, LIKE, NOT LIKE, BETWEEN, NOT BETWEEN
// NOTE: The NOT prefix for LIKE and BETWEEN will be represented in the
// token type.  When traversing the AST, use the token type, and not the
// token text, to interpret the semantics of these nodes.
relationalExpression
	: concatenation (
		( ( ( LT^ | GT^ | LE^ | GE^ ) additiveExpression )* )
		// Disable node production for the optional 'not'.
		| (n:NOT!)? (
			// Represent the optional NOT prefix using the token type by
			// testing 'n' and setting the token type accordingly.
			(i:IN^ {
					#i.setType( (n == null) ? IN : NOT_IN);
					#i.setText( (n == null) ? "in" : "not in");
				}
				inList)
			| (b:BETWEEN^ {
					#b.setType( (n == null) ? BETWEEN : NOT_BETWEEN);
					#b.setText( (n == null) ? "between" : "not between");
				}
				betweenList )
			| (l:LIKE^ {
					#l.setType( (n == null) ? LIKE : NOT_LIKE);
					#l.setText( (n == null) ? "like" : "not like");
				}
				concatenation likeEscape)
			| (MEMBER! (OF!)? p:path! {
				processMemberOf(n,#p,currentAST);
			  } ) )
		)
	;

likeEscape
	: (ESCAPE^ concatenation)?
	;

inList
	: x:compoundExpr
	{ #inList = #([IN_LIST,"inList"], #inList); }
	;

betweenList
	: concatenation AND! concatenation
	;

// level 4 - string concatenation
concatenation
	: additiveExpression
	( c:CONCAT^ { #c.setType(EXPR_LIST); #c.setText("concatList"); }
	  additiveExpression
	  ( CONCAT! additiveExpression )*
	  { #concatenation = #([METHOD_CALL, "||"], #([IDENT, "concat"]), #c ); } )?
	;

// level 3 - binary plus and minus
additiveExpression
	: multiplyExpression ( ( PLUS^ | MINUS^ ) multiplyExpression )*
	;

// level 2 - binary multiply and divide
multiplyExpression
	: unaryExpression ( ( STAR^ | DIV^ ) unaryExpression )*
	;

// level 1 - unary minus, unary plus, not
unaryExpression
	: MINUS^ {#MINUS.setType(UNARY_MINUS);} unaryExpression
	| PLUS^ {#PLUS.setType(UNARY_PLUS);} unaryExpression
	| caseExpression
	| quantifiedExpression
	| atom
	;

caseExpression
	: CASE^ (whenClause)+ (elseClause)? END!
	| CASE^ { #CASE.setType(CASE2); } unaryExpression (altWhenClause)+ (elseClause)? END!
	;

whenClause
	: (WHEN^ logicalExpression THEN! unaryExpression)
	;

altWhenClause
	: (WHEN^ unaryExpression THEN! unaryExpression)
	;

elseClause
	: (ELSE^ unaryExpression)
	;

quantifiedExpression
	: ( SOME^ | EXISTS^ | ALL^ | ANY^ )
	( identifier | collectionExpr | (OPEN! ( subQuery ) CLOSE!) )
	;

// level 0 - expression atom
// ident qualifier ('.' ident ), array index ( [ expr ] ),
// method call ( '.' ident '(' exprList ')' )
atom
	 : primaryExpression
		(
			DOT^ identifier
				( options { greedy=true; } :
					( op:OPEN^ {#op.setType(METHOD_CALL);} exprList CLOSE! ) )?
		|	lb:OPEN_BRACKET^ {#lb.setType(INDEX_OP);} expression CLOSE_BRACKET!
		)*
	;

// level 0 - the basic element of an expression
primaryExpression
	:   identPrimary ( options {greedy=true;} : DOT^ "class" )?
	|   constant
	|   COLON^ identifier
	// TODO: Add parens to the tree so the user can control the operator evaluation order.
	|   OPEN! (expressionOrVector | subQuery) CLOSE!
	|   PARAM^ (NUM_INT)?
	;

// This parses a normal expression or a list of expressions separated by commas.  If a comma is encountered,
// a parent VECTOR_EXPR node will be created for the list.
expressionOrVector!
	: e:expression ( v:vectorExpr )? {
		// If this is a vector expression, create a parent node for it.
		if (#v != null)
			#expressionOrVector = #([VECTOR_EXPR,"{vector}"], #e, #v);
		else
			#expressionOrVector = #e;
	}
	;

vectorExpr
	: COMMA! expression (COMMA! expression)*
	;

// identifier, followed by member refs (dot ident), or method calls.
// NOTE: handleDotIdent() is called immediately after the first IDENT is recognized because
// the method looks ahead to find keywords after DOT and turns them into identifiers.
identPrimary
	: identifier { handleDotIdent(); }
			( options { greedy=true; } : DOT^ ( identifier | ELEMENTS | o:OBJECT { #o.setType(IDENT); } ) )*
			( options { greedy=true; } :
				( op:OPEN^ { #op.setType(METHOD_CALL);} exprList CLOSE! )
			)?
	// Also allow special 'aggregate functions' such as count(), avg(), etc.
	| aggregate
	;

//## aggregate:
//##     ( aggregateFunction OPEN path CLOSE ) | ( COUNT OPEN STAR CLOSE ) | ( COUNT OPEN (DISTINCT | ALL) path CLOSE );
//## aggregateFunction:
//##     COUNT | 'sum' | 'avg' | 'max' | 'min';
aggregate
	: ( SUM^ | AVG^ | MAX^ | MIN^ ) OPEN! additiveExpression CLOSE! { #aggregate.setType(AGGREGATE); }
	// Special case for count - its 'parameters' can be keywords.
	|  COUNT^ OPEN! ( STAR { #STAR.setType(ROW_STAR); } | ( ( DISTINCT | ALL )? ( path | collectionExpr ) ) ) CLOSE!
	|  collectionExpr
	;

//## collection: ( OPEN query CLOSE ) | ( 'elements'|'indices' OPEN path CLOSE );
collectionExpr
	: (ELEMENTS^ | INDICES^) OPEN! path CLOSE!
	;

// NOTE: compoundExpr can be a 'path' where the last token in the path is '.elements' or '.indices'
compoundExpr
	: collectionExpr
	| path
	| (OPEN! ( (expression (COMMA! expression)*) | subQuery ) CLOSE!)
	;

subQuery
	: union
	{ #subQuery = #([QUERY,"query"], #subQuery); }
	;

exprList
{
	AST trimSpec = null;
}
	: (t:TRAILING {#trimSpec = #t;} | l:LEADING {#trimSpec = #l;} | b:BOTH {#trimSpec = #b;})?
	  		{ if(#trimSpec != null) #trimSpec.setType(IDENT); }
	  (
	  		expression ( (COMMA! expression)+ | FROM { #FROM.setType(IDENT); } expression | AS! identifier )?
	  		| FROM { #FROM.setType(IDENT); } expression
	  )?
			{ #exprList = #([EXPR_LIST,"exprList"], #exprList); }
	;

constant
	: NUM_INT
	| NUM_FLOAT
	| NUM_LONG
	| NUM_DOUBLE
	| QUOTED_STRING
	| NULL
	| TRUE
	| FALSE
	| EMPTY
	;

//## quantifiedExpression: 'exists' | ( expression 'in' ) | ( expression OP 'any' | 'some' ) collection;

//## compoundPath: path ( OPEN_BRACKET expression CLOSE_BRACKET ( '.' path )? )*;
//## path: identifier ( '.' identifier )*;
path
	: identifier ( DOT^ { weakKeywords(); } identifier )*
	;

// Wraps the IDENT token from the lexer, in order to provide
// 'keyword as identifier' trickery.
identifier
	: IDENT
	exception
	catch [RecognitionException ex]
	{
		identifier_AST = handleIdentifierError(LT(1),ex);
	}
	;

// **** LEXER ******************************************************************

/**
 * Hibernate Query Language Lexer
 * <br>
 * This lexer provides the HQL parser with tokens.
 * @author Joshua Davis (pgmjsd@sourceforge.net)
 */
class HqlBaseLexer extends Lexer;

options {
	exportVocab=Hql;      // call the vocabulary "Hql"
	testLiterals = false;
	k=2; // needed for newline, and to distinguish '>' from '>='.
	// HHH-241 : Quoted strings don't allow unicode chars - This should fix it.
	charVocabulary='\u0000'..'\uFFFE';	// Allow any char but \uFFFF (16 bit -1, ANTLR's EOF character)
	caseSensitive = false;
	caseSensitiveLiterals = false;
}

// -- Declarations --
{
	// NOTE: The real implementations are in the subclass.
	protected void setPossibleID(boolean possibleID) {}
}

// -- Keywords --

EQ: '=';
LT: '<';
GT: '>';
SQL_NE: "<>";
NE: "!=" | "^=";
LE: "<=";
GE: ">=";

COMMA: ',';

OPEN: '(';
CLOSE: ')';
OPEN_BRACKET: '[';
CLOSE_BRACKET: ']';

CONCAT: "||";
PLUS: '+';
MINUS: '-';
STAR: '*';
DIV: '/';
COLON: ':';
PARAM: '?';

IDENT options { testLiterals=true; }
	: ID_START_LETTER ( ID_LETTER )*
		{
			// Setting this flag allows the grammar to use keywords as identifiers, if necessary.
			setPossibleID(true);
		}
	;

protected
ID_START_LETTER
    :    '_'
    |    '$'
    |    'a'..'z'
    |    '\u0080'..'\ufffe'       // HHH-558 : Allow unicode chars in identifiers
    ;

protected
ID_LETTER
    :    ID_START_LETTER
    |    '0'..'9'
    ;

QUOTED_STRING
	: '\'' ( (ESCqs)=> ESCqs | ~'\'' )* '\''
	;

protected
ESCqs
	:
		'\'' '\''
	;

WS  :   (   ' '
		|   '\t'
		|   '\r' '\n' { newline(); }
		|   '\n'      { newline(); }
		|   '\r'      { newline(); }
		)
		{$setType(Token.SKIP);} // ignore this token
	;

//--- From the Java example grammar ---
// a numeric literal
NUM_INT
	{boolean isDecimal=false; Token t=null;}
	:   '.' {_ttype = DOT;}
			(	('0'..'9')+ (EXPONENT)? (f1:FLOAT_SUFFIX {t=f1;})?
				{
					if (t != null && t.getText().toUpperCase().indexOf('F')>=0)
					{
						_ttype = NUM_FLOAT;
					}
					else
					{
						_ttype = NUM_DOUBLE; // assume double
					}
				}
			)?
	|	(	'0' {isDecimal = true;} // special case for just '0'
			(	('x')
				(											// hex
					// the 'e'|'E' and float suffix stuff look
					// like hex digits, hence the (...)+ doesn't
					// know when to stop: ambig.  ANTLR resolves
					// it correctly by matching immediately.  It
					// is therefore ok to hush warning.
					options { warnWhenFollowAmbig=false; }
				:	HEX_DIGIT
				)+
			|	('0'..'7')+									// octal
			)?
		|	('1'..'9') ('0'..'9')*  {isDecimal=true;}		// non-zero decimal
		)
		(	('l') { _ttype = NUM_LONG; }

		// only check to see if it's a float if it looks like a decimal so far
		|	{isDecimal}?
			(   '.' ('0'..'9')* (EXPONENT)? (f2:FLOAT_SUFFIX {t=f2;})?
			|   EXPONENT (f3:FLOAT_SUFFIX {t=f3;})?
			|   f4:FLOAT_SUFFIX {t=f4;}
			)
			{
				if (t != null && t.getText().toUpperCase().indexOf('F') >= 0)
				{
					_ttype = NUM_FLOAT;
				}
				else
				{
					_ttype = NUM_DOUBLE; // assume double
				}
			}
		)?
	;

// hexadecimal digit (again, note it's protected!)
protected
HEX_DIGIT
	:	('0'..'9'|'a'..'f')
	;

// a couple protected methods to assist in matching floating point numbers
protected
EXPONENT
	:	('e') ('+'|'-')? ('0'..'9')+
	;

protected
FLOAT_SUFFIX
	:	'f'|'d'
	;
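The precedence chart in the comments above maps one-to-one onto the layered rules: each level delegates to the next-tighter one (logicalOrExpression down through additiveExpression, multiplyExpression, and unaryExpression). As a rough illustration of that layering, here is a hypothetical recursive-descent evaluator for just the arithmetic levels, written in Python. It is not part of hql.g or Hibernate; all names (`Parser`, `evaluate`, etc.) are invented for this sketch.

```python
# Sketch only: mirrors the grammar's additive -> multiply -> unary layering.
# Not Hibernate code; names are invented for illustration.
import re

def tokenize(s):
    # Integers and single-character operators/parens, whitespace ignored.
    return re.findall(r"\d+|[+\-*/()]", s)

class Parser:
    def __init__(self, tokens):
        self.toks = tokens
        self.i = 0

    def peek(self):
        return self.toks[self.i] if self.i < len(self.toks) else None

    def eat(self):
        t = self.toks[self.i]
        self.i += 1
        return t

    # level 3 - binary plus and minus (cf. additiveExpression)
    def additive(self):
        value = self.multiply()
        while self.peek() in ("+", "-"):
            op = self.eat()
            rhs = self.multiply()
            value = value + rhs if op == "+" else value - rhs
        return value

    # level 2 - binary multiply and divide (cf. multiplyExpression)
    def multiply(self):
        value = self.unary()
        while self.peek() in ("*", "/"):
            op = self.eat()
            rhs = self.unary()
            value = value * rhs if op == "*" else value / rhs
        return value

    # level 1 - unary minus/plus, plus the atoms (cf. unaryExpression)
    def unary(self):
        t = self.peek()
        if t == "-":
            self.eat()
            return -self.unary()
        if t == "+":
            self.eat()
            return +self.unary()
        if t == "(":
            self.eat()
            value = self.additive()
            assert self.eat() == ")", "unbalanced parenthesis"
            return value
        return int(self.eat())

def evaluate(s):
    return Parser(tokenize(s)).additive()
```

Because lower-precedence rules call higher-precedence ones, `evaluate("1+2*3")` binds `*` tighter than `+`, which is the same effect the rule ordering achieves in the grammar.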
