# plrs

The multi-tool of lexical analysis and tokenization.

## Installation

Install from PyPI:

```shell
pip install plrs
```

or build the extension module from source with [maturin](https://github.com/PyO3/maturin):

```shell
maturin build
```
## API

Top-level exports:

- `EOF_TOKEN`
- `Tokens`
- `Settings`

### `Token`
- `part`
- `token`
- `set_part`
- `set_token`
- `__str__`
- `__repr__`
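A hedged sketch of the shape this API suggests: assuming `part` holds a token's category and `token` holds its matched text (the real class is implemented in Rust, and these semantics are an inference from the method names, not confirmed by the source), a pure-Python stand-in would look like:

```python
class Token:
    """Pure-Python stand-in for a Token: a category paired with its text."""

    def __init__(self, part, token):
        self.part = part    # assumed: token category, e.g. "NUMERIC"
        self.token = token  # assumed: the matched source text

    def set_part(self, part):
        self.part = part

    def set_token(self, token):
        self.token = token

    def __str__(self):
        return self.token

    def __repr__(self):
        return f"Token({self.part!r}, {self.token!r})"


t = Token("NUMERIC", "42")
t.set_token("43")  # mutate the text in place via the setter
```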
### `Lexer`
- `new`
- `char_forward`
- `skip_over_char_set`
- `next`
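The method names suggest a cursor-based lexer: construct with `new`, advance one character with `char_forward`, skip runs of characters with `skip_over_char_set`, and pull tokens with `next`. A simplified pure-Python analogue, under those assumptions (the whitespace-delimited `next` below is an illustration, not plrs's actual tokenization rules):

```python
class Lexer:
    """Simplified cursor-based lexer mimicking the method names above."""

    def __init__(self, source):
        self.source = source
        self.pos = 0  # cursor index into the source string

    @classmethod
    def new(cls, source):
        return cls(source)

    def char_forward(self):
        # Advance the cursor by one character.
        self.pos += 1

    def skip_over_char_set(self, char_set):
        # Skip a run of characters belonging to char_set.
        while self.pos < len(self.source) and self.source[self.pos] in char_set:
            self.char_forward()

    def next(self):
        # Return the next whitespace-delimited chunk, or None at end of input.
        self.skip_over_char_set(" \t\r\n")
        if self.pos >= len(self.source):
            return None
        start = self.pos
        while self.pos < len(self.source) and self.source[self.pos] not in " \t\r\n":
            self.char_forward()
        return self.source[start:self.pos]


lx = Lexer.new("let x = 1")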
### Character and token predicates

- `is_char_symbol`
- `is_char_operator`
- `is_char_whitespace`
- `is_char_numeric`
- `is_single_quote`
- `is_double_quote`
- `ends_token`
- `is_part_numeric`
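These predicates classify single characters, with `ends_token` deciding token boundaries and `is_part_numeric` classifying an accumulated string. Plausible pure-Python equivalents, assuming conventional definitions for each character class (the exact symbol and operator sets are guesses, not taken from plrs):

```python
SYMBOLS = set("()[]{};:,.")      # assumed punctuation set
OPERATORS = set("+-*/%=<>!&|^~") # assumed operator set

def is_char_symbol(ch):
    return ch in SYMBOLS

def is_char_operator(ch):
    return ch in OPERATORS

def is_char_whitespace(ch):
    return ch in " \t\r\n"

def is_char_numeric(ch):
    return "0" <= ch <= "9"

def is_single_quote(ch):
    return ch == "'"

def is_double_quote(ch):
    return ch == '"'

def ends_token(ch):
    # A token ends at whitespace, a symbol, or an operator.
    return is_char_whitespace(ch) or is_char_symbol(ch) or is_char_operator(ch)

def is_part_numeric(part):
    # True when the accumulated part is non-empty and all digits.
    return len(part) > 0 and all(is_char_numeric(c) for c in part)
```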
### `tokenize`
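`tokenize` is the one-call entry point. A self-contained sketch of the general technique the predicate names imply — accumulate a part until a breaking character, emitting symbols and operators as their own tokens (the real work happens in the Rust lexer, and this breaking set is an assumption):

```python
def tokenize(source):
    """Minimal whitespace/symbol/operator tokenizer sketch."""
    breaking = set("()[]{};:,.+-*/%=<>!&|^~")  # assumed token-ending chars
    tokens, part = [], ""
    for ch in source:
        if ch.isspace() or ch in breaking:
            if part:                  # flush the accumulated part
                tokens.append(part)
                part = ""
            if ch in breaking:        # symbols/operators are tokens themselves
                tokens.append(ch)
        else:
            part += ch
    if part:                          # flush a trailing part
        tokens.append(part)
    return tokens
```

For example, `tokenize("x = 1 + 2;")` yields `["x", "=", "1", "+", "2", ";"]`.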