It is unclear to me precisely how `untokenize` works: it can untokenize the whole list `tokens`, yet passing it a list containing only one token from `tokens` untokenizes all elements of `tokens` up to that point. When iterating over `tokens`, `untokenize` seems to emit nothing if given `tokens[n]` where `n` hasn't been reached yet… or something.
I have no idea how this function behaves; its output does not seem logical to me.
token_utils is designed to 1) recreate exactly the original source from a list of tokens; and 2) make it easy to substitute the string content of one or more tokens, recreating a transformed source in which the spacing between the remaining tokens is as it was in the original.
Each token is an object carrying information that includes the entire content of the line it came from. The untokenize function uses this content to know where to insert each token's string, thus preserving the spacing between tokens.
For token_utils, untokenizing and printing a single token amounts to simply doing `print(token.string)`.
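As a sketch of the round-trip guarantee described above, here is the same idea illustrated with the standard library's `tokenize` module rather than `token_utils` itself (the source string and the index used are made up for illustration):

```python
import io
import tokenize

source = "a = 1 +  2\n"   # note the irregular spacing
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Each token records its string, its start/end positions, and the
# entire source line it came from.
plus = tokens[3]
print(plus.string)          # the token's content: '+'
print(repr(plus.line))      # the full original line

# With full position information, untokenize reproduces the source
# exactly, spacing included.
assert tokenize.untokenize(tokens) == source
```

The key point is the last line: because every token carries its position and its line, the original spacing can always be reconstructed.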
Alternatively, to untokenize a single token at index 4, as you do, an overcomplicated way of doing it would be:
Used in context, such a code fragment will print the source corresponding to `t.tokens[0:4]`.
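The original code fragment did not survive here, but the slice behavior the issue describes can be sketched with the standard library's `tokenize` module, used in place of `token_utils` (the source string is made up for illustration):

```python
import io
import tokenize

source = "total = price *  quantity\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Untokenizing an initial slice reproduces the source up to that point,
# original spacing included, because each token carries its position.
prefix = tokenize.untokenize(tokens[0:4])
print(prefix)   # prints: total = price *
```

This is why passing a list ending at token `n` yields everything up to that token: untokenization rebuilds the text from the recorded positions, not from the tokens alone.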