Ñò Žf¯Jc@s÷dZddkZyeWn#ej oddklZnXddklZlZddk l Z ddk l Z l Z lZlZddklZlZlZlZdd d d d d ddddg Zed„ƒZdefd„ƒYZdefd„ƒYZd efd„ƒYZd efd„ƒYZdefd„ƒYZ defd„ƒYZ!d„Z"defd„ƒYZ#e#ƒZ$d„Z%d efd!„ƒYZ&d efd"„ƒYZ'd efd#„ƒYZ(d e'fd$„ƒYZ)d%„Z*dS(&s´ pygments.lexer ~~~~~~~~~~~~~~ Base lexer classes. :copyright: Copyright 2006-2009 by the Pygments team, see AUTHORS. :license: BSD, see LICENSE for details. iÿÿÿÿN(tSet(t apply_filterstFilter(tget_filter_by_name(tErrortTexttOthert _TokenType(t get_bool_optt get_int_optt get_list_opttmake_analysatortLexert RegexLexertExtendedRegexLexertDelegatingLexert LexerContexttincludetflagstbygroupstusingtthiscCsdS(g((tx((s2/usr/lib/python2.6/site-packages/pygments/lexer.pytst LexerMetacBseZdZd„ZRS(s‚ This metaclass automagically converts ``analyse_text`` methods into static methods which always return float values. cCs;d|jot|dƒ|ds(R)t __class__R (R.((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyt__repr__\s   cKs7t|tƒpt||}n|ii|ƒdS(s8 Add a new stream filter to this lexer. N(t isinstanceRRR(tappend(R.R/R)((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR-cscCsdS(s~ Has to return a float between ``0`` and ``1`` that indicates if a lexer wants to highlight this text. Used by ``guess_lexer``. If this method returns ``0`` it won't highlight it in any case, if it returns ``1`` highlighting with this lexer is guaranteed. The `LexerMeta` metaclass automatically wraps this function so that it works like a static method (no ``self`` or ``cls`` parameter) and the return value is automatically converted to `float`. If the return value is an object that is boolean `False` it's the same as if the return values was ``0.0``. N((ttext((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRksc sÜtˆtƒpîˆidjo`y7ˆidƒ‰ˆidƒoˆtdƒ‰nWqútj oˆidƒ‰qúXqþˆidjoXyddk}Wntj otdƒ‚nX|i ˆƒ}ˆi|d ƒ‰qþˆiˆiƒ‰nˆi d d ƒ‰ˆi d d ƒ‰ˆi oˆi ƒ‰nˆi oˆi d ƒ‰nˆid joˆiˆiƒ‰nˆid ƒpˆd 7‰n‡‡fd†}|ƒ}|pt|ˆiˆƒ}n|S(s= Return an iterable of (tokentype, value) pairs generated from `text`. If `unfiltered` is set to `True`, the filtering mechanism is bypassed even if filters are defined. Also preprocess the text, i.e. expand tabs and strip it if wanted and applies registered filters. tguesssutf-8uR'tchardetiÿÿÿÿNskTo enable chardet encoding guessing, please install the chardet library from http://chardet.feedparser.org/R&s s s ic3s2x+ˆiˆƒD]\}}}||fVqWdS(N(tget_tokens_unprocessed(titttv(R5R.(s2/usr/lib/python2.6/site-packages/pygments/lexer.pytstreamer¡s(R3tunicodeR&tdecodet startswithtlentUnicodeDecodeErrorR7t ImportErrortdetecttreplaceR$tstripR#R%t expandtabstendswithRR((R.R5t unfilteredR7tencR<tstream((R5R.s2/usr/lib/python2.6/site-packages/pygments/lexer.pyt get_tokensys>    cCs t‚dS(s  Return an iterable of (tokentype, value) pairs. In subclasses, implement this method as a generator to maximize effectiveness. N(tNotImplementedError(R.R5((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR8©sN(R R!R"tNoneRtaliasest filenamestalias_filenamest mimetypesRt __metaclass__R0R2R-RR+RKR8(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR ,s    0cBs#eZdZed„Zd„ZRS(s  This lexer takes two lexer as arguments. A root lexer and a language lexer. First everything is scanned using the language lexer, afterwards all ``Other`` tokens are lexed using the root lexer. The lexers from the ``template`` lexer package use this base lexer. cKs;|||_|||_||_ti||dS(N(t root_lexertlanguage_lexertneedleR R0(R.t _root_lexert_language_lexert_needleR)((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR0¼s cCsÓd}g}g}x|ii|ƒD]m\}}}||ijo8|o#|it|ƒ|fƒg}n||7}q%|i|||fƒq%W|o|it|ƒ|fƒnt||ii|ƒƒS(Nt(RTR8RUR4R@t do_insertionsRS(R.R5tbufferedt insertionst lng_bufferR9R:R;((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR8Âs (R R!R"RR0R8(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR²s cBseZdZRS(sI Indicates that a state should include rules from another state. (R R!R"(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRÙstcombinedcBs eZdZd„Zd„ZRS(s: Indicates a state combined from multiple states. cGsti||ƒS(N(ttupleR(Rtargs((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRåscGsdS(N((R.R`((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR0ès(R R!R"RR0(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR^às t _PseudoMatchcBsMeZdZd„Zdd„Zdd„Zdd„Zd„Zd„Z RS(s: A pseudo match object constructed from a string. cCs||_||_dS(N(t_textt_start(R.tstartR5((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR0òs cCs|iS(N(Rc(R.targ((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRdöscCs|it|iƒS(N(RcR@Rb(R.Re((s2/usr/lib/python2.6/site-packages/pygments/lexer.pytendùscCs|otdƒ‚n|iS(Ns No such group(t IndexErrorRb(R.Re((s2/usr/lib/python2.6/site-packages/pygments/lexer.pytgroupüscCs |ifS(N(Rb(R.((s2/usr/lib/python2.6/site-packages/pygments/lexer.pytgroupsscCshS(N((R.((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyt groupdictsN( R R!R"R0RMRdRfRhRiRj(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRaís     csd‡fd†}|S(sL Callback that yields multiple actions for each group in the match. c3sxïtˆƒD]á\}}|djoq q t|ƒtjo=|i|dƒ}|o|i|dƒ||fVqîq |o|i|dƒ|_nxM||t|i|dƒ|i|dƒƒ|ƒD]}|o |VqÔqÔWq W|o|iƒ|_ndS(Ni( t enumerateRMRRRhRdtposRaRf(tlexertmatchtctxR9tactiontdatatitem(R`(s2/usr/lib/python2.6/site-packages/pygments/lexer.pytcallback s"  #N(RM(R`Rs((R`s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRst_ThiscBseZdZRS(sX Special singleton used for indicating the caller class. Used by ``using``. (R R!R"(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRt sc sœh‰dˆjoGˆidƒ}t|ttfƒo|ˆds c3s„ˆi|iƒˆˆ}|iƒ}x;|i|iƒˆD]!\}}}||||fVqAW|o|iƒ|_ndS(N(RxR)RdR8RhRfRl(RmRnRoRyRzR9R:R;(R{R|t_other(s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRsMs  N(tpopR3tlistR_RRM(R}R|RzRs((R{R|R}s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR(s    tRegexLexerMetacBs,eZdZd„Zdd„Zd„ZRS(sw Metaclass for RegexLexer, creates the self._tokens attribute from self.tokens on the first instantiation. c Cs^t|ƒtjptd|‚|ddjptd|‚||jo ||Sg}||<|i}xê||D]Þ}t|tƒoD||jptd|‚|i|i||t|ƒƒƒqxnt|ƒtjptd|‚yt i |d|ƒi }Wn5t j o)}t d|d|||fƒ‚nXt|dƒtjp&t|dƒptd |df‚t|ƒd jo d} n›|d } t| tƒo‡| d jo d } q<| |jo | f} q<| d jo | } q<| d djot| dƒ } q<tptd| ‚nút| tƒod|i} |id7_g} xE| D]=} | |jptd| ‚| i|i||| ƒƒq|W| || <| f} nit| tƒoCx6| D].}||jp|djptd|‚qëW| } ntptd| ‚|i||d| fƒqxW|S(Nswrong state name %rit#sinvalid state name %rscircular state reference %rswrong rule def %rs+uncompilable regex %r in state %r of %r: %sis2token type must be simple type or callable, not %ris#popiÿÿÿÿs#pushis#pop:sunknown new state %rs_tmp_%dscircular state ref %rsunknown new state sunknown new state def %r(s#pops#push(RtstrtAssertionErrorRR3Rtextendt_process_stateR_tretcompileRnt Exceptiont ValueErrorRtcallableR@RMtintR+R^t_tmpnameR4( Rt unprocessedt processedRuttokenstrflagsttdeftrexterrt new_statettdef2titokenstistate((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR…`sl!    "!+             cCsSh}|i|<|p |i|}x'|iƒD]}|i|||ƒq2W|S(N(t _all_tokensRtkeysR…(RRt tokendefsRŽRu((s2/usr/lib/python2.6/site-packages/pygments/lexer.pytprocess_tokendef s  cOsot|dƒpLh|_d|_t|dƒo|ioq\|id|iƒ|_nti|||ŽS(Nt_tokensittoken_variantsRY( thasattrR˜RŒRR›RRœRt__call__(RR`tkwds((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRŸ§s  N(R R!R"R…RMR›RŸ(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR€Zs @ cBs/eZdZeZeiZhZdd„Z RS(s± Base for simple stateful regular expression-based lexers. Simplifies the lexing process so that you need only provide a list of states and regular expressions. Rwc csd}|i}t|ƒ}||d}xèxâ|D]`\}}} |||ƒ} | o;t|ƒtjo||| iƒfVn x||| ƒD] } | Vq“W| iƒ}| d j oÓt| tƒo_x®| D]P} | djo|i ƒqÒ| djo|i |dƒqÒ|i | ƒqÒWnSt| t ƒo || 3n8| djo|i |dƒnt pt d| ‚||d}nPq3q3Wyc||djo2|d7}dg}|d}|td fVw,n|t||fV|d7}Wq,tj oPq,Xq,d S( s} Split ``text`` into (tokentype, text) pairs. ``stack`` is the inital stack (default: ``['root']``) iiÿÿÿÿs#pops#pushswrong state def: %rs iRwu N(RœRRRRhRfRMR3R_R~R4R‹R+RƒRRRg( R.R5RvRlRšt statestackt statetokenstrexmatchRpR”tmRrRu((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR8ÓsT             (sroot( R R!R"R€RRR†t MULTILINERRR8(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR ´s  cBs&eZdZddd„Zd„ZRS(s9 A helper object that holds lexer position data. cCs?||_||_|p t|ƒ|_|pdg|_dS(NRw(R5RlR@RfRv(R.R5RlRvRf((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR0s  cCsd|i|i|ifS(NsLexerContext(%r, %r, %r)(R5RlRv(R.((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR2sN(R R!R"RMR0R2(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR scBseZdZddd„ZRS(sE A RegexLexer that uses a context object to store its state. c cs\|i}|pt|dƒ}|d}n!|}||id}|i}xx|D]O\}}}|||i|iƒ} | o!t|ƒtjo*|i|| iƒfV| iƒ|_n?x||| |ƒD] } | VqÕW|p||id}n|d j o”t |t ƒo|ii |ƒn\t |t ƒo|i|3n>|djo|ii|idƒntptd|‚||id}nPqWqWWy”|i|ijoPn||idjo=|id7_dg|_|d}|itdfVwPn|it||ifV|id7_WqPtj oPqPXqPd S( s Split ``text`` into (tokentype, text) pairs. If ``context`` is given, use this lexer context instead. iRwiÿÿÿÿs#pushswrong state def: %rs iu N(RœRRvR5RlRfRRRhRMR3R_R„R‹R4R+RƒRRRg( R.R5tcontextRšRoR¢R£RpR”R¤Rr((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyR8sV        N(R R!R"RMR8(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRsc csÖt|ƒ}y|iƒ\}}Wn*tj ox|D] }|Vq:WdSXd}t}x$|D]\}}} |djo |}nd} xÇ|o¿|t| ƒ|jo¨| | ||!} ||| fV|t| ƒ7}x5|D]-\} } }|| |fV|t|ƒ7}qåW||} y|iƒ\}}Wq‘tj ot}Pq‘Xq‘W||| | fV|t| ƒ| 7}qbW|oI|pd}x9|D]-\}}} ||| fV|t| ƒ7}qWndS(sg Helper for lexers which must combine the results of several sublexers. ``insertions`` is a list of ``(index, itokens)`` pairs. Each ``itokens`` iterable should be inserted at position ``index`` into the token stream given by the ``tokens`` argument. The result is a combined token stream. TODO: clean up the code here. Ni(titertnextt StopIterationRMR*R@R+(R\RtindexR–RrtrealpostinsleftR9R:R;toldittmpvaltit_indextit_tokentit_valuetp((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyRZWsL       (+R"R†tsett NameErrortsetsRtpygments.filterRRtpygments.filtersRtpygments.tokenRRRRt pygments.utilRR R R t__all__t staticmethodt_default_analyseRRtobjectR RR‚RR_R^RaRRtRRR€R RRRZ(((s2/usr/lib/python2.6/site-packages/pygments/lexer.pyt s6 ""  †'    2ZU>