…Matcher, add a useful one to Token, and confirm with a test that __slots__ does not behave in CPython the way the language docs describe.
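The message doesn't say which __slots__ subtlety the test covers; one well-known CPython behavior that surprises people is that __slots__ only suppresses the per-instance __dict__ when every class in the MRO defines it. A minimal sketch of that quirk (class names here are illustrative, not the actual test):

```python
class Base:
    """No __slots__, so instances get a __dict__."""

class Slotted(Base):
    # Does NOT prevent a per-instance __dict__, because Base lacks __slots__.
    __slots__ = ('x',)

obj = Slotted()
obj.x = 1  # stored in the slot
obj.y = 2  # allowed anyway: the instance still has a __dict__
print(obj.__dict__)  # {'y': 2} -- the slot value 'x' is not in the dict
```

So adding __slots__ to a subclass saves no memory unless the whole inheritance chain cooperates, which is easy to get wrong silently.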
…hose operating on pre-lexed streams of tokens.
…e Token to utils.
This removes the need for special token syntax in the grammar DSL; we just reuse the string literal syntax, which is otherwise useless when matching token lists. It should also speed things up, since we no longer need to dispatch at parse time between basestrings and lists.

To do:
* Token doesn't belong in parsimonious.expressions.
* Needs better tests: an eq_ in test_parse_success, for starters.
* Documentation in the readme
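A toy illustration of the idea (not parsimonious's actual API): when the input is a pre-lexed token list, a "string literal" in the grammar can simply match against each token's type, so no dedicated token syntax is needed.

```python
class Token:
    """A lexed token; only its type matters to the grammar."""
    def __init__(self, type):
        self.type = type

def match_literal(literal, tokens, pos):
    """Match a grammar 'string literal' against the type of tokens[pos].

    Returns the new position on success, or None on failure.
    """
    if pos < len(tokens) and tokens[pos].type == literal:
        return pos + 1
    return None

# 'if', 'name', 'then' are token types, matched by plain string literals:
stream = [Token('if'), Token('name'), Token('then')]
```

This also shows why the parse-time type dispatch goes away: a literal always compares against `token.type`, never against characters.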
Avoid redundant work in the reference resolver. Lift the set of exprs that have already been resolved out of _resolve_refs() so it persists across calls. This avoids forgetting all that experience on each iteration of the list comprehension over rules in visit_rules(). For grammars with hundreds of rules, this can be the difference between 13s and 95ms compilation times. Keeping an explicit set of done exprs means we no longer need the members=() trick. The trick might still save a few calls, but it's worth giving those up to keep the code simple.
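The shape of the fix, as a simplified sketch (function names and the `done` set are illustrative, not the real code): thread one shared set of finished exprs through every resolution, instead of starting fresh for each rule.

```python
def resolve_refs(rule_map, expr, done):
    """Resolve lazy references inside expr, recording finished exprs in done.

    Because `done` is shared across all top-level rules, an expr reachable
    from several rules is walked only once.
    """
    if id(expr) in done:
        return expr
    done.add(id(expr))
    # ...replace LazyReference members with rule_map lookups here...
    for member in getattr(expr, 'members', ()):
        resolve_refs(rule_map, member, done)
    return expr

def resolve_all(rule_map):
    done = set()  # lifted out of the per-rule loop: the whole point
    return [resolve_refs(rule_map, expr, done) for expr in rule_map.values()]

# Tiny demo: two rules sharing one subexpression, which is visited once.
class Expr:
    def __init__(self, members=()):
        self.members = members

shared = Expr()
rules = {'a': Expr((shared,)), 'b': Expr((shared,))}
resolved = resolve_all(rules)
```

With a per-rule set, the shared subtree would be re-walked once per referencing rule, which is quadratic-ish on heavily cross-referencing grammars; one shared set makes each expr cost O(1) after its first visit.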
…ained a forward reference. Bump version to 0.6.1. Fix #61. Previously, the default rule could end up as a LazyReference, so Grammar.parse() would explode.
This way, we can reuse it when people try to parse using AbstractGrammars. Making it a VisitationError never made any sense anyway; it didn't even call the VisitationError constructor or comply with its invariants.
… based on it. To do: make all Expressions worth subclassing public.
…t subclassing; I haven't figured that out yet. @rule is better. It's more explicit, it's the same number of lines, it avoids visitors having to subclass something else, and it lets us still have visitor docstrings that actually document things.
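A rough sketch of the @rule idea being argued for (the decorator and its behavior here are hypothetical, reconstructed from the message, not the shipped API): the grammar fragment rides on the method as an attribute, leaving the docstring free to actually document the method.

```python
def rule(fragment):
    """Attach a grammar-rule fragment to a visitor method (hypothetical)."""
    def decorate(method):
        method.rule = fragment
        return method
    return decorate

class Visitor:
    @rule('number = ~"[0-9]+"')
    def visit_number(self, node, children):
        """Turn a matched number into an int."""  # docstring still documents
        return int(node)
```

Compared with declaring rules in docstrings or via subclassing, this keeps the declaration explicit and next to the handler while costing no extra lines.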
…es". Close #47.

* Add an `expression()` factory for turning simple lambdas into Expressions, mostly for internal use by Grammars when converting custom rules.
* Add support for "custom rules" in Grammars. These provide a hook for simple custom parsing, spelled as Python lambdas. If you want to abuse the packrat cache, you can even be stateful! (If this becomes useful, we'll add another arg for passing a proper blackboard around.) For heavy-duty needs, you can plug in Compound Expressions with LazyReferences as subexpressions, and the Grammar will hook them up for optimal efficiency: no trips through the Grammar's hash at parse time.
* Allow grammars without a default rule (in cases where there are no string rules), which in turn allows empty grammars. Maybe someone generating grammars dynamically will find that useful.
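Conceptually, a lambda-backed custom rule might look like this toy stand-in (the signature and names are illustrative, not parsimonious's real `expression()`): the lambda reports where a match ends, and the wrapper turns that into a named match.

```python
def expression(callable_, name):
    """Wrap a (text, pos) -> end_pos_or_None lambda as a match function.

    A toy stand-in for a factory that turns plain callables into
    grammar expressions.
    """
    def match(text, pos=0):
        end = callable_(text, pos)
        return None if end is None else (name, text[pos:end])
    return match

# A "custom rule": match a run of digits, written as a plain lambda.
digits = expression(
    lambda text, pos: (
        next((i for i in range(pos, len(text)) if not text[i].isdigit()),
             len(text))
        if pos < len(text) and text[pos].isdigit()
        else None
    ),
    'digits',
)
```

The appeal is that anything expressible as "given text and a position, tell me where the match ends" becomes a rule without writing an Expression subclass.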
I don't think that's guaranteed across Python implementations.
… add the concept of a default grammar for a visitor. This makes the common case of parsing a string and applying exactly one visitor to the AST shorter and simpler. It is unlikely that a visitor would ever be used with more than one grammar, so I expect all visitors to have an easily chosen default grammar.
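The convenience this enables might look roughly like the following sketch (attribute and method names are guesses at the shape of the feature, not the shipped API): the visitor carries its usual grammar as a class attribute, so callers parse and visit in one step.

```python
class NodeVisitor:
    """Minimal stand-in for a visitor base class with a default grammar."""
    grammar = None  # subclasses set this to their usual grammar

    def parse(self, text):
        """Parse `text` with the default grammar, then visit the tree."""
        return self.visit(self.grammar.parse(text))

    def visit(self, node):
        raise NotImplementedError

# Toy grammar + visitor showing the one-call flow:
class UpperGrammar:
    def parse(self, text):
        return text  # the "tree" is just the text in this toy

class Shouter(NodeVisitor):
    grammar = UpperGrammar()

    def visit(self, node):
        return node.upper()
```

Instead of `visitor.visit(grammar.parse(text))` at every call site, the common case collapses to `Shouter().parse(text)`.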
…rator overloading. Marcell got the wrong impression.