class documentation

A tokenizer for string annotations.

Class Method build Undocumented
Static Method recombine_sets Merge the tokens of a special literal-choices set back into a single token.
Static Method tokenize_str Split the string into tokens for further processing.
Method __init__ Undocumented
Instance Variable tokens The tokens.
Instance Variable warnings The warnings triggered during the tokenization.
Static Method _additional_warnings Append additional warnings derived from the tokens.
Static Method _token_type Find the type of a token. Types are defined in the TokenType enum.
def build(cls, raw_tokens: list[object], warnings: list[str], warns_on_unknown_tokens: bool) -> list[Token]: (source)

Undocumented

def recombine_sets(tokens: list[Any]) -> list[object]: (source)

Merge the tokens of a special literal-choices set back into a single token.

Example

>>> tokens = ["{", "1", ", ", "2", "}"]
>>> Tokenizer.recombine_sets(tokens)
['{1, 2}']
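
A minimal sketch of how such a merge could work, based only on the doctest above; `recombine_sets_sketch` is a hypothetical stand-in, not the library's actual implementation:

```python
from typing import Any

def recombine_sets_sketch(tokens: list[Any]) -> list[object]:
    """Merge the tokens between "{" and "}" back into one token (sketch)."""
    result: list[object] = []
    buffer: list[str] | None = None  # holds an in-progress set literal
    for tok in tokens:
        if tok == "{" and buffer is None:
            buffer = ["{"]
        elif buffer is not None:
            buffer.append(str(tok))
            if tok == "}":
                result.append("".join(buffer))
                buffer = None
        else:
            result.append(tok)
    if buffer is not None:
        # An unclosed "{" is flushed back as separate tokens.
        result.extend(buffer)
    return result

recombine_sets_sketch(["{", "1", ", ", "2", "}"])  # → ['{1, 2}']
```

Tokens outside a `{...}` group pass through unchanged, which matches the behavior implied by the example.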
def tokenize_str(spec: str) -> list[str]: (source)

Split the string into tokens for further processing.
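
One plausible way to split an annotation string while keeping the delimiters as their own tokens is `re.split` with a capturing group. This is only a sketch under an assumed token pattern; the real tokenizer's pattern may differ:

```python
import re

# Hypothetical pattern: runs of whitespace and single bracket/comma
# characters are kept as separate tokens. The capturing group makes
# re.split() keep the delimiters in the output.
_TOKEN_RE = re.compile(r"(\s+|[{}\[\](),])")

def tokenize_str_sketch(spec: str) -> list[str]:
    # Empty strings produced between adjacent delimiters are dropped.
    return [tok for tok in _TOKEN_RE.split(spec) if tok]
```

For example, `tokenize_str_sketch("list of int")` yields `['list', ' ', 'of', ' ', 'int']`, and a set literal such as `"{1, 2}"` is split into the pieces that `recombine_sets` would later merge back together.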

def __init__(self, annotation: str, *, warns_on_unknown_tokens: bool): (source)
warnings: list[str] = (source)

The warnings triggered during the tokenization.

def _additional_warnings(tokens: list[Token], warnings: list[str]): (source)

Append additional warnings derived from the tokens.

def _token_type(token: object, warnings: list[str], warns_on_unknown_tokens: bool) -> TokenType: (source)

Find the type of a token. Types are defined in the TokenType enum.
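
As a rough illustration of token classification, the sketch below uses a hypothetical `TokenTypeSketch` enum; the members of the library's actual TokenType enum are not shown on this page and may be entirely different:

```python
import enum

class TokenTypeSketch(enum.Enum):
    # Hypothetical categories, for illustration only.
    DELIMITER = enum.auto()
    LITERAL = enum.auto()
    WORD = enum.auto()

_DELIMITERS = {"{", "}", "[", "]", "(", ")", ","}

def token_type_sketch(token: object) -> TokenTypeSketch:
    """Classify a single token (sketch, not the real _token_type)."""
    if isinstance(token, str) and token in _DELIMITERS:
        return TokenTypeSketch.DELIMITER
    if isinstance(token, str) and token.startswith(("'", '"')):
        return TokenTypeSketch.LITERAL
    return TokenTypeSketch.WORD
```

In the real method, unrecognized tokens can additionally trigger a warning when `warns_on_unknown_tokens` is set, which is why it takes the `warnings` list as a parameter.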