hql.g

Source code (jar package) of the Hibernate open-source framework; hope you find it useful.
//             ( 4)  relational: <, <=, >, >=,
//                   LIKE, NOT LIKE, BETWEEN, NOT BETWEEN, IN, NOT IN
//             ( 3)  addition and subtraction: +(binary) -(binary)
//             ( 2)  multiplication: * / %, concatenate: ||
// highest --> ( 1)  +(unary) -(unary)
//                   []   () (method call)  . (dot -- identifier qualification)
//                   aggregate function
//                   ()  (explicit parenthesis)
//
// Note that the above precedence levels map to the rules below...
// Once you have a precedence chart, writing the appropriate rules as below
// is usually very straightforward

logicalExpression
	: expression
	;

// Main expression rule
expression
	: logicalOrExpression
	;

// level 7 - OR
logicalOrExpression
	: logicalAndExpression ( OR^ logicalAndExpression )*
	;

// level 6 - AND, NOT
logicalAndExpression
	: negatedExpression ( AND^ negatedExpression )*
	;

// NOT nodes aren't generated.  Instead, the operator in the sub-tree will be
// negated, if possible.  Expressions without a NOT parent are passed through.
negatedExpression!
{ weakKeywords(); } // Weak keywords can appear in an expression, so look ahead.
	: NOT^ x:negatedExpression { #negatedExpression = negateNode(#x); }
	| y:equalityExpression { #negatedExpression = #y; }
	;

//## OP: EQ | LT | GT | LE | GE | NE | SQL_NE | LIKE;

// level 5 - EQ, NE
equalityExpression
	: x:relationalExpression (
		( EQ^
		| is:IS^	{ #is.setType(EQ); } (NOT! { #is.setType(NE); } )?
		| NE^
		| ne:SQL_NE^	{ #ne.setType(NE); }
		) y:relationalExpression)* {
			// Post process the equality expression to clean up 'is null', etc.
			#equalityExpression = processEqualityExpression(#equalityExpression);
		}
	;

// level 4 - LT, GT, LE, GE, LIKE, NOT LIKE, BETWEEN, NOT BETWEEN
// NOTE: The NOT prefix for LIKE and BETWEEN will be represented in the
// token type.  When traversing the AST, use the token type, and not the
// token text, to interpret the semantics of these nodes.
relationalExpression
	: concatenation (
		( ( ( LT^ | GT^ | LE^ | GE^ ) additiveExpression )* )
		// Disable node production for the optional 'not'.
		| (n:NOT!)? (
			// Represent the optional NOT prefix using the token type by
			// testing 'n' and setting the token type accordingly.
			(i:IN^ {
					#i.setType( (n == null) ? IN : NOT_IN);
					#i.setText( (n == null) ? "in" : "not in");
				}
				inList)
			| (b:BETWEEN^ {
					#b.setType( (n == null) ? BETWEEN : NOT_BETWEEN);
					#b.setText( (n == null) ? "between" : "not between");
				}
				betweenList )
			| (l:LIKE^ {
					#l.setType( (n == null) ? LIKE : NOT_LIKE);
					#l.setText( (n == null) ? "like" : "not like");
				}
				concatenation likeEscape)
			| (MEMBER! (OF!)? p:path! {
				processMemberOf(n,#p,currentAST);
			  } ) )
		)
	;

likeEscape
	: (ESCAPE^ concatenation)?
	;

inList
	: x:compoundExpr
	{ #inList = #([IN_LIST,"inList"], #inList); }
	;

betweenList
	: concatenation AND! concatenation
	;

// level 4 - string concatenation
concatenation
	: additiveExpression
	( c:CONCAT^ { #c.setType(EXPR_LIST); #c.setText("concatList"); }
	  additiveExpression
	  ( CONCAT! additiveExpression )*
	  { #concatenation = #([METHOD_CALL, "||"], #([IDENT, "concat"]), #c ); } )?
	;

// level 3 - binary plus and minus
additiveExpression
	: multiplyExpression ( ( PLUS^ | MINUS^ ) multiplyExpression )*
	;

// level 2 - binary multiply and divide
multiplyExpression
	: unaryExpression ( ( STAR^ | DIV^ ) unaryExpression )*
	;

// level 1 - unary minus, unary plus, not
unaryExpression
	: MINUS^ {#MINUS.setType(UNARY_MINUS);} unaryExpression
	| PLUS^ {#PLUS.setType(UNARY_PLUS);} unaryExpression
	| caseExpression
	| quantifiedExpression
	| atom
	;

caseExpression
	: CASE^ (whenClause)+ (elseClause)? END!
	| CASE^ { #CASE.setType(CASE2); } unaryExpression (altWhenClause)+ (elseClause)? END!
	;

whenClause
	: (WHEN^ logicalExpression THEN! unaryExpression)
	;

altWhenClause
	: (WHEN^ unaryExpression THEN! unaryExpression)
	;

elseClause
	: (ELSE^ unaryExpression)
	;

quantifiedExpression
	: ( SOME^ | EXISTS^ | ALL^ | ANY^ )
	( identifier | collectionExpr | (OPEN! ( subQuery ) CLOSE!) )
	;

// level 0 - expression atom
// ident qualifier ('.' ident ), array index ( [ expr ] ),
// method call ( '.' ident '(' exprList ') )
atom
	 : primaryExpression
		(
			DOT^ identifier
				( options { greedy=true; } :
					( op:OPEN^ {#op.setType(METHOD_CALL);} exprList CLOSE! ) )?
		|	lb:OPEN_BRACKET^ {#lb.setType(INDEX_OP);} expression CLOSE_BRACKET!
		)*
	;

// level 0 - the basic element of an expression
primaryExpression
	:   identPrimary ( options {greedy=true;} : DOT^ "class" )?
	|   constant
	|   COLON^ identifier
	// TODO: Add parens to the tree so the user can control the operator evaluation order.
	|   OPEN! (expressionOrVector | subQuery) CLOSE!
	|   PARAM^ (NUM_INT)?
	;

// This parses a normal expression or a list of expressions separated by commas.  If a comma is encountered,
// a parent VECTOR_EXPR node will be created for the list.
expressionOrVector!
	: e:expression ( v:vectorExpr )? {
		// If this is a vector expression, create a parent node for it.
		if (#v != null)
			#expressionOrVector = #([VECTOR_EXPR,"{vector}"], #e, #v);
		else
			#expressionOrVector = #e;
	}
	;

vectorExpr
	: COMMA! expression (COMMA! expression)*
	;

// identifier, followed by member refs (dot ident), or method calls.
// NOTE: handleDotIdent() is called immediately after the first IDENT is recognized because
// the method looks ahead to find keywords after DOT and turns them into identifiers.
identPrimary
	: identifier { handleDotIdent(); }
			( options { greedy=true; } : DOT^ ( identifier | ELEMENTS | o:OBJECT { #o.setType(IDENT); } ) )*
			( options { greedy=true; } :
				( op:OPEN^ { #op.setType(METHOD_CALL);} exprList CLOSE! )
			)?
	// Also allow special 'aggregate functions' such as count(), avg(), etc.
	| aggregate
	;

//## aggregate:
//##     ( aggregateFunction OPEN path CLOSE ) | ( COUNT OPEN STAR CLOSE ) | ( COUNT OPEN (DISTINCT | ALL) path CLOSE );
//## aggregateFunction:
//##     COUNT | 'sum' | 'avg' | 'max' | 'min';
aggregate
	: ( SUM^ | AVG^ | MAX^ | MIN^ ) OPEN! additiveExpression CLOSE! { #aggregate.setType(AGGREGATE); }
	// Special case for count - Its 'parameters' can be keywords.
	|  COUNT^ OPEN! ( STAR { #STAR.setType(ROW_STAR); } | ( ( DISTINCT | ALL )? ( path | collectionExpr ) ) ) CLOSE!
	|  collectionExpr
	;

//## collection: ( OPEN query CLOSE ) | ( 'elements'|'indices' OPEN path CLOSE );
collectionExpr
	: (ELEMENTS^ | INDICES^) OPEN! path CLOSE!
	;

// NOTE: compoundExpr can be a 'path' where the last token in the path is '.elements' or '.indices'
compoundExpr
	: collectionExpr
	| path
	| (OPEN! ( (expression (COMMA! expression)*) | subQuery ) CLOSE!)
	;

subQuery
	: union
	{ #subQuery = #([QUERY,"query"], #subQuery); }
	;

exprList
{
	AST trimSpec = null;
}
	: (t:TRAILING {#trimSpec = #t;} | l:LEADING {#trimSpec = #l;} | b:BOTH {#trimSpec = #b;})?
	  		{ if(#trimSpec != null) #trimSpec.setType(IDENT); }
	  (
	  		expression ( (COMMA! expression)+ | FROM { #FROM.setType(IDENT); } expression | AS! identifier )?
	  		| FROM { #FROM.setType(IDENT); } expression
	  )?
			{ #exprList = #([EXPR_LIST,"exprList"], #exprList); }
	;

constant
	: NUM_INT
	| NUM_FLOAT
	| NUM_LONG
	| NUM_DOUBLE
	| QUOTED_STRING
	| NULL
	| TRUE
	| FALSE
	| EMPTY
	;

//## quantifiedExpression: 'exists' | ( expression 'in' ) | ( expression OP 'any' | 'some' ) collection;

//## compoundPath: path ( OPEN_BRACKET expression CLOSE_BRACKET ( '.' path )? )*;
//## path: identifier ( '.' identifier )*;
path
	: identifier ( DOT^ { weakKeywords(); } identifier )*
	;

// Wraps the IDENT token from the lexer, in order to provide
// 'keyword as identifier' trickery.
identifier
	: IDENT
	exception
	catch [RecognitionException ex]
	{
		identifier_AST = handleIdentifierError(LT(1),ex);
	}
	;

// **** LEXER ******************************************************************

/**
 * Hibernate Query Language Lexer
 * <br>
 * This lexer provides the HQL parser with tokens.
 * @author Joshua Davis (pgmjsd@sourceforge.net)
 */
class HqlBaseLexer extends Lexer;

options {
	exportVocab=Hql;      // call the vocabulary "Hql"
	testLiterals = false;
	k=2; // needed for newline, and to distinguish '>' from '>='.
	// HHH-241 : Quoted strings don't allow unicode chars - This should fix it.
	charVocabulary='\u0000'..'\uFFFE';	// Allow any char but \uFFFF (16 bit -1, ANTLR's EOF character)
	caseSensitive = false;
	caseSensitiveLiterals = false;
}

// -- Declarations --
{
	// NOTE: The real implementations are in the subclass.
	protected void setPossibleID(boolean possibleID) {}
}

// -- Keywords --
EQ: '=';
LT: '<';
GT: '>';
SQL_NE: "<>";
NE: "!=" | "^=";
LE: "<=";
GE: ">=";

COMMA: ',';

OPEN: '(';
CLOSE: ')';
OPEN_BRACKET: '[';
CLOSE_BRACKET: ']';

CONCAT: "||";
PLUS: '+';
MINUS: '-';
STAR: '*';
DIV: '/';
COLON: ':';
PARAM: '?';

IDENT options { testLiterals=true; }
	: ID_START_LETTER ( ID_LETTER )*
		{
			// Setting this flag allows the grammar to use keywords as identifiers, if necessary.
			setPossibleID(true);
		}
	;

protected
ID_START_LETTER
    :    '_'
    |    '$'
    |    'a'..'z'
    |    '\u0080'..'\ufffe'       // HHH-558 : Allow unicode chars in identifiers
    ;

protected
ID_LETTER
    :    ID_START_LETTER
    |    '0'..'9'
    ;

QUOTED_STRING
	: '\'' ( (ESCqs)=> ESCqs | ~'\'' )* '\''
	;

protected
ESCqs
	:
		'\'' '\''
	;

WS  :   (   ' '
		|   '\t'
		|   '\r' '\n' { newline(); }
		|   '\n'      { newline(); }
		|   '\r'      { newline(); }
		)
		{$setType(Token.SKIP);} //ignore this token
	;

//--- From the Java example grammar ---
// a numeric literal
NUM_INT
	{boolean isDecimal=false; Token t=null;}
	:   '.' {_ttype = DOT;}
			(	('0'..'9')+ (EXPONENT)? (f1:FLOAT_SUFFIX {t=f1;})?
				{
					if (t != null && t.getText().toUpperCase().indexOf('F')>=0)
					{
						_ttype = NUM_FLOAT;
					}
					else
					{
						_ttype = NUM_DOUBLE; // assume double
					}
				}
			)?
	|	(	'0' {isDecimal = true;} // special case for just '0'
			(	('x')
				(						// hex
					// the 'e'|'E' and float suffix stuff look
					// like hex digits, hence the (...)+ doesn't
					// know when to stop: ambig.  ANTLR resolves
					// it correctly by matching immediately.  It
					// is therefore ok to hush warning.
					options { warnWhenFollowAmbig=false; }
				:	HEX_DIGIT
				)+
			|	('0'..'7')+			// octal
			)?
		|	('1'..'9') ('0'..'9')*  {isDecimal=true;}		// non-zero decimal
		)
		(	('l') { _ttype = NUM_LONG; }

		// only check to see if it's a float if it looks like decimal so far
		|	{isDecimal}?
			(   '.' ('0'..'9')* (EXPONENT)? (f2:FLOAT_SUFFIX {t=f2;})?
			|   EXPONENT (f3:FLOAT_SUFFIX {t=f3;})?
			|   f4:FLOAT_SUFFIX {t=f4;}
			)
			{
				if (t != null && t.getText().toUpperCase().indexOf('F') >= 0)
				{
					_ttype = NUM_FLOAT;
				}
				else
				{
					_ttype = NUM_DOUBLE; // assume double
				}
			}
		)?
	;

// hexadecimal digit (again, note it's protected!)
protected
HEX_DIGIT
	:	('0'..'9'|'a'..'f')
	;

// a couple protected rules to assist in matching floating point numbers
protected
EXPONENT
	:	('e') ('+'|'-')? ('0'..'9')+
	;

protected
FLOAT_SUFFIX
	:	'f'|'d'
	;
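The `relationalExpression` rule above folds an optional NOT prefix into the operator token itself (IN becomes NOT_IN, BETWEEN becomes NOT_BETWEEN, LIKE becomes NOT_LIKE) instead of emitting a separate NOT node, so tree walkers only ever inspect a single token type. A minimal Java sketch of that mapping, using hypothetical stand-in constants rather than the generated Hql vocabulary:

```java
// Sketch only: illustrates the NOT-folding idea from relationalExpression.
// The constants below are stand-ins, not Hibernate's generated token types.
public class NotFoldingSketch {
    static final int IN = 1, NOT_IN = 2;
    static final int BETWEEN = 3, NOT_BETWEEN = 4;
    static final int LIKE = 5, NOT_LIKE = 6;

    /**
     * Return the operator's token type, switched to its negated counterpart
     * when a NOT prefix was consumed (the grammar tests 'n == null' the same way).
     */
    static int fold(int opType, boolean notSeen) {
        if (!notSeen) {
            return opType;
        }
        switch (opType) {
            case IN:      return NOT_IN;
            case BETWEEN: return NOT_BETWEEN;
            case LIKE:    return NOT_LIKE;
            default:      return opType; // operators without a NOT form pass through
        }
    }

    public static void main(String[] args) {
        // "x NOT IN (...)" yields one NOT_IN node, not a NOT node over IN.
        System.out.println(fold(IN, true) == NOT_IN);   // true
        System.out.println(fold(LIKE, false) == LIKE);  // true
    }
}
```

The payoff, as the grammar's own comment notes, is that AST consumers interpret these nodes by token type alone and never need to special-case a NOT parent.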
