fix: typo
vberlier committed Jun 17, 2021
1 parent 81ccdb9 commit 934f52c
Showing 2 changed files with 22 additions and 7 deletions.
4 changes: 3 additions & 1 deletion README.md
@@ -38,6 +38,8 @@ Writing recursive-descent parsers by hand can be quite elegant but it's often a
 - Checkpoints for backtracking parsers
 - Works well with Python 3.10+ match statements
 
+Check out the [`examples`](https://github.com/vberlier/tokenstream/tree/main/examples) directory for practical examples.
+
 ## Installation
 
 The package can be installed with `pip`.
@@ -146,7 +148,7 @@ with stream.syntax(word=r"\w+", comment=r"#.+$"), stream.indent(skip=["comment"]
 
 ### Checkpoints
 
-The `checkpoint()` method returns a context manager that resets the stream at the current token at the end of the `with` statement. You can use the returned `commit()` function to keep the state of the stream at the end of the `with` statement.
+The `checkpoint()` method returns a context manager that resets the stream to the current token at the end of the `with` statement. You can use the returned `commit()` function to keep the state of the stream at the end of the `with` statement.
 
 ```python
 stream = TokenStream("hello world")
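The checkpoint/commit behavior described in the corrected sentence can be sketched with a minimal stand-in. `MiniStream` is a hypothetical toy class for illustration only, not the real `TokenStream` implementation:

```python
# Minimal stand-in for the checkpoint/commit pattern (hypothetical toy
# class, not the real tokenstream API).
from contextlib import contextmanager
from typing import Callable, Iterator, List


class MiniStream:
    def __init__(self, tokens: List[str]) -> None:
        self.tokens = tokens
        self.index = -1

    def next(self) -> str:
        self.index += 1
        return self.tokens[self.index]

    @contextmanager
    def checkpoint(self) -> Iterator[Callable[[], None]]:
        saved = self.index
        committed: List[bool] = []

        def commit() -> None:
            committed.append(True)

        try:
            yield commit
        finally:
            if not committed:
                self.index = saved  # roll back unless commit() was called


stream = MiniStream(["hello", "world"])

with stream.checkpoint() as commit:
    assert stream.next() == "hello"
# commit() was never called, so the stream was reset
assert stream.next() == "hello"

with stream.checkpoint() as commit:
    assert stream.next() == "world"
    commit()
# commit() was called, so the stream keeps its position
assert stream.index == 1
```

Recording the commit into a list lets the inner closure signal the enclosing frame without `nonlocal`, and the `finally` clause makes the rollback happen even if the `with` body raises.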
25 changes: 19 additions & 6 deletions tokenstream/stream.py
@@ -83,11 +83,11 @@ class TokenStream:
         use the higher-level :meth:`checkpoint` method for this.
     tokens
-        A list accumulating all the extracting tokens.
+        A list accumulating all the extracted tokens.
         The list contains all the extracted tokens, even the ones ignored
         when using the :meth:`ignore` method. For this reason you shouldn't
-        try to index into the list directly, and use methods like :meth:`expect`,
+        try to index into the list directly. Use methods like :meth:`expect`,
         :meth:`peek`, or :meth:`collect` instead.
     indentation
@@ -98,12 +98,16 @@ class TokenStream:
         A set of token types for which the token stream shouldn't emit indentation
         changes.
         Can be set using the ``skip`` argument of the :meth:`indent` method.
+    generator
+        An instance of the :meth:`generate_tokens` generator that the stream
+        iterates through to extract and emit tokens.
+        Should be considered internal.
     ignored_tokens
-        A set of tokens types that the stream skips over when iterating, peeking,
+        A set of token types that the stream skips over when iterating, peeking,
         and expecting tokens.
     regex_cache
@@ -134,7 +138,11 @@ def __post_init__(self) -> None:
         self.ignored_tokens = {"whitespace", "newline"}
 
     def bake_regex(self) -> None:
-        """Compile the syntax rules."""
+        """Compile the syntax rules.
+        Called automatically upon instantiation and when the syntax rules change.
+        Should be considered internal.
+        """
         self.regex = re.compile(
             "|".join(
                 f"(?P<{name}>{regex})"
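The named-group alternation that `bake_regex` compiles can be sketched independently with plain `re`. The rules below are hypothetical examples for illustration, not the library's defaults:

```python
import re

# Hypothetical syntax rules for illustration.
rules = {"word": r"[a-z]+", "number": r"[0-9]+"}

# Same shape as the alternation bake_regex builds: one named group per rule.
pattern = re.compile(
    "|".join(f"(?P<{name}>{regex})" for name, regex in rules.items())
)

# match.lastgroup reveals which alternative matched, i.e. the token type.
assert pattern.match("hello").lastgroup == "word"
assert pattern.match("42").lastgroup == "number"
```

Because exactly one named group can match at each position, `lastgroup` doubles as the token type, which is why a single compiled pattern can drive the whole tokenizer.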
@@ -161,6 +169,9 @@ def crop(self) -> None:
         ...     print(stream.tokens[-1].value)
         world
         hello
+        Mostly used to ensure consistency in some of the provided context managers.
+        Should be considered internal.
         """
         if self.index + 1 < len(self.tokens):
             self.tokens = self.tokens[: self.index + 1]
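The slicing performed by `crop()` can be shown on a plain list standing in for `stream.tokens` (an illustration of the operation, not the real implementation):

```python
# Plain-list illustration of the slicing crop() performs.
tokens = ["hello", " ", "world"]
index = 0  # position of the current token

# Everything after the current token is discarded, matching the doctest
# above where tokens[-1].value goes from "world" back to "hello".
if index + 1 < len(tokens):
    tokens = tokens[: index + 1]

assert tokens == ["hello"]
```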
@@ -353,7 +364,7 @@ def ignore(self, *token_types: str) -> Iterator[None]:
     def current(self) -> Token:
         """The current token.
-        Can only be accessed if the stream started extracted tokens.
+        Can only be accessed if the stream started extracting tokens.
         """
         return self.tokens[self.index]

@@ -526,6 +537,8 @@ def peek(self, n: int = 1) -> Optional[Token]:
         'hello'
         'world'
         'hello'
+        >>> stream.previous.value
+        ' '
         """
         previous_index = self.index
         token = None
@@ -796,7 +809,7 @@ def expect_eof(self) -> None:
 
     @contextmanager
     def checkpoint(self) -> Iterator[CheckpointCommit]:
-        """Reset the stream to the current token at the end of the with statement.
+        """Reset the stream to the current token at the end of the ``with`` statement.
         >>> stream = TokenStream("hello world")
         >>> with stream.syntax(word=r"[a-z]+"):
