While reusing the same lexer object (via the tokenize API) with GROUPS (for whitespace tokens), the group array is not cleared after each lexer execution. As a result, on the nth execution of the lexer (n > 1) the group array contains stale tokens from previous inputs.
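A minimal sketch of the bug class being reported (this is not Chevrotain's actual implementation; the ToyLexer, its token shape, and the "whitespace" group name are illustrative assumptions): when the group arrays live on the lexer instance and are never reset, a second tokenize() call accumulates tokens from the first input.

```javascript
// Toy lexer illustrating group state leaking between tokenize() calls.
class ToyLexer {
  constructor() {
    // BUG: group arrays are stored on the instance and never cleared,
    // so they accumulate tokens across tokenize() invocations.
    this.groups = { whitespace: [] };
  }

  tokenize(text) {
    const tokens = [];
    const re = /(\s+)|(\S+)/g;
    let m;
    while ((m = re.exec(text)) !== null) {
      if (m[1] !== undefined) {
        // Space tokens are routed into the "whitespace" group,
        // analogous to Chevrotain's GROUPS feature.
        this.groups.whitespace.push({ image: m[1], startOffset: m.index });
      } else {
        tokens.push({ image: m[2], startOffset: m.index });
      }
    }
    return { tokens, groups: this.groups };
  }
}

const lexer = new ToyLexer();
lexer.tokenize("a b"); // first input: one space token
const second = lexer.tokenize("c d"); // second input: one more space token
// The second result still carries the space token from "a b":
console.log(second.groups.whitespace.length); // 2, instead of the expected 1
```

The fix for this pattern is to initialize fresh group arrays at the start of every tokenize() call rather than once in the constructor, so each result object owns only the tokens of its own input.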
regards,
Efrat
bd82 changed the title from 'unclean lexer groups state' to 'Lexer "Groups" state persists between different inputs tokenization.' on Oct 10, 2016