MediaWiki REL1_22
cssjanus.Tokenizer Class Reference

List of all members.

Public Member Functions

def __init__
def DeTokenize
def Tokenize
def TokenizeMatches

Public Attributes

 originals
 token_re
 token_string

Detailed Description

Replaces any CSS comments with string tokens and vice versa.

Definition at line 205 of file cssjanus.py.
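The round trip the class performs can be sketched as follows. This is a minimal illustrative reimplementation, not the actual code from cssjanus.py; in particular the `~LABEL_N~` token format and the delimiter character are assumptions made for this sketch.

```python
import re

class Tokenizer(object):
    """Replaces strings matching a regex with numbered tokens and back.

    Sketch for illustration; the token format is an assumption, not
    necessarily what cssjanus.py produces.
    """

    def __init__(self, token_re, token_string):
        self.token_re = token_re          # regex whose matches get tokenized
        self.token_string = token_string  # label placed inside each token
        self.originals = []               # matched strings, in match order

    def Tokenize(self, line):
        # re.sub with a callable replacement invokes TokenizeMatches once
        # per match, so every occurrence in the line is tokenized.
        return self.token_re.sub(self.TokenizeMatches, line)

    def TokenizeMatches(self, m):
        self.originals.append(m.group(0))
        return '~%s_%d~' % (self.token_string, len(self.originals))

    def DeTokenize(self, line):
        # Substitute each stored original back for its numbered token.
        for i, original in enumerate(self.originals, start=1):
            line = line.replace('~%s_%d~' % (self.token_string, i), original)
        return line
```

For example, tokenizing a CSS line hides its comments from later processing, and detokenizing restores them:

```python
tok = Tokenizer(re.compile(r'/\*.*?\*/'), 'COMMENT')
hidden = tok.Tokenize('a { left: 0 } /* keep */ b { right: 0 }')
restored = tok.DeTokenize(hidden)
```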


Constructor & Destructor Documentation

def cssjanus.Tokenizer.__init__ (   self,
  token_re,
  token_string 
)
Constructor for the Tokenizer.

Args:
  token_re: A regex for the strings to be replaced by tokens.
  token_string: The string to put between token delimiters when tokenizing.

Definition at line 208 of file cssjanus.py.


Member Function Documentation

def cssjanus.Tokenizer.DeTokenize (   self,
  line 
)
Replaces tokens with the original string.

Args:
  line: A line with tokens.

Returns:
  The line with any tokens replaced by their original strings.

Definition at line 237 of file cssjanus.py.

References cssjanus.Tokenizer.originals, and cssjanus.Tokenizer.token_string.

def cssjanus.Tokenizer.Tokenize (   self,
  line 
)
Replaces any string matching token_re in line with string tokens.

By passing a function as the replacement argument to the re.sub call below,
re.sub invokes that function once for each non-overlapping match, which lets
every occurrence be recorded in originals and replaced with its own token.

Args:
  line: A line to replace token_re matches in.

Returns:
  line: A line with token_re matches tokenized.

Definition at line 220 of file cssjanus.py.

References cssjanus.Tokenizer.TokenizeMatches().
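The key mechanism here is that re.sub accepts a callable as its replacement argument and calls it once per match object. A standalone illustration (the `repl` helper and `~TOKEN_N~` format are hypothetical, chosen for this example):

```python
import re

originals = []  # records each matched comment, in order

def repl(m):
    # Called by re.sub once per match; returns the replacement string.
    originals.append(m.group(0))
    return '~TOKEN_%d~' % len(originals)

result = re.sub(r'/\*.*?\*/', repl, '/* a */ x /* b */')
# Both comments are replaced, each with a distinct numbered token,
# and both originals are recorded for later restoration.
```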

def cssjanus.Tokenizer.TokenizeMatches (   self,
  m 
)
Replaces matches with tokens and stores the originals.

Args:
  m: A match object.

Returns:
  A string token which replaces the CSS comment.

Definition at line 256 of file cssjanus.py.

References cssjanus.Tokenizer.originals, and cssjanus.Tokenizer.token_string.

Referenced by cssjanus.Tokenizer.Tokenize().


Member Data Documentation

Definition at line 213 of file cssjanus.py.


The documentation for this class was generated from the following file:

cssjanus.py