As discussed on Gitter, sometimes a parser ends up parsing the same piece of text twice: first in the tokenizer, and then again via `Apply` in the parser. This happens because when `TKind` is an enum, there is nowhere to store the initial parsing result.
Instead of using an enum as the result of tokenization, it could therefore be advisable to use a normal class that has the token type as a field. This way the parser will have access to the already-parsed result and will not need to call `Apply` to run the parse again.
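To make the idea concrete, here is a minimal sketch of what such a token-kind class could look like. The names (`TokenKind`, `NumberKind`) are purely illustrative and not part of Superpower's API:

```csharp
// Hypothetical sketch: a class hierarchy used as the token kind
// instead of an enum, so tokenization results can be stored on it.
public abstract class TokenKind
{
    public string Name { get; }
    protected TokenKind(string name) => Name = name;
}

public sealed class NumberKind : TokenKind
{
    // The value produced during tokenization is kept here, so the
    // parser can read it directly instead of re-parsing via Apply.
    public decimal Value { get; }
    public NumberKind(decimal value) : base("number") => Value = value;
}
```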
To facilitate this approach we would use `Matching` instead of `Token.EqualTo`, so that we can select on the type, which has now become an object field.
The `Matching` method could look like this:
```csharp
public static TokenListParser<TKind, Token<TKind>> Matching<TKind>(Func<TKind, bool> predicate, string name)
{
    if (predicate == null) throw new ArgumentNullException(nameof(predicate));
    if (name == null) throw new ArgumentNullException(nameof(name));

    return Matching(predicate, new[] { name });
}

private static TokenListParser<TKind, Token<TKind>> Matching<TKind>(Func<TKind, bool> predicate, string[] expectations)
{
    if (predicate == null) throw new ArgumentNullException(nameof(predicate));
    if (expectations == null) throw new ArgumentNullException(nameof(expectations));

    return input =>
    {
        var next = input.ConsumeToken();
        if (!next.HasValue || !predicate(next.Value.Kind))
            return TokenListParserResult.Empty<TKind, Token<TKind>>(input, expectations);

        return next;
    };
}
```
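A parser could then select tokens by type rather than by enum value. A hedged usage sketch, assuming a hypothetical `NumberKind` token-kind class that carries the `decimal` value produced during tokenization:

```csharp
// Hypothetical usage: select any token whose kind is a NumberKind,
// then read the stored value directly -- no second parse via Apply.
TokenListParser<TokenKind, decimal> Number =
    Matching<TokenKind>(k => k is NumberKind, "number")
        .Select(t => ((NumberKind)t.Kind).Value);
```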