Allow 'foo . (42)' calls #922
Conversation
@@ -232,8 +232,13 @@ tokenize([$.,T|Rest], Line, Scope, Tokens) when ?comp1(T); ?op1(T); T == $& ->
% Dot call

% ## Exception for .( as it needs to be treated specially in the parser
tokenize([$.,$(|Rest], Line, Scope, Tokens) ->
  tokenize([$(|Rest], Line, Scope, add_token_with_nl({ dot_call_op, Line, '.' }, Tokens));
tokenize([$.,T1,T2|T], Line, Scope, Tokens) when T1 == $( ; (T1 == $\s andalso T2 == $() ->
The tricky thing is: it could also be 'foo. (42)' or 'foo.\n(42)'. I think the best way to handle this is to loop through all space-like characters until the next non-space character is found and return the rest. We do something similar here:

Furthermore, we have this issue not only for 'foo. (42)' but also for 'foo. <<>>(42)' and more. So we would need to match all these clauses too:

If none of those clauses match, we should tokenize just the '.' (i.e. this clause: https://github.com/bellthoven/elixir/blob/bd6f27a5d5c7cce3d8214822f93fa0201ecc7994/lib/elixir/src/elixir_tokenizer.erl#L376).
To sum up:
- Match on the dot
- Discard the space-like characters
- Check for special dot calls ((), <<>> and others)
- If it is not a special dot call, tokenize just the dot
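A minimal sketch of those steps, for illustration only: it leans on the module's existing tokenize/4 and add_token_with_nl/2 (both visible in the diff above), while tokenize_dot_call/4 and strip_space/1 are hypothetical helper names; only the '(' token shape is taken from the diff, and newline/line-count handling is glossed over.

% Match on the dot, then skip the run of space-like characters after it.
% (In the real module this clause would come after the operator clauses,
% such as the ?comp1/?op1 one shown above, since clause order matters.)
tokenize([$.|Rest], Line, Scope, Tokens) ->
  tokenize_dot_call(strip_space(Rest), Line, Scope, Tokens).

% Special dot call: '(' becomes a dot_call_op, as in the diff above.
tokenize_dot_call([$(|_] = Rest, Line, Scope, Tokens) ->
  tokenize(Rest, Line, Scope, add_token_with_nl({ dot_call_op, Line, '.' }, Tokens));
% Other special forms ('<<' and friends) would get their own clauses here.
% Fallback: nothing special follows, so tokenize just the dot
% (token shape assumed).
tokenize_dot_call(Rest, Line, Scope, Tokens) ->
  tokenize(Rest, Line, Scope, add_token_with_nl({ '.', Line }, Tokens)).

% Drop leading spaces and tabs; a full version would also accept '\n'
% and bump the line counter.
strip_space([$\s|T]) -> strip_space(T);
strip_space([$\t|T]) -> strip_space(T);
strip_space(Rest) -> Rest.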
Wow, this is great! Thanks! We are almost there; we just need to generalize this approach a bit to work with any number of spaces and with the different token combinations.
Oh! Thanks. I was silly to take it literally =)
I am about to add more tests, but I am not sure about other dot calls. It seems
@@ -211,6 +211,14 @@ tokenize("..." ++ Rest, Line, Scope, Tokens) ->
  tokenize(Rest, Line, Scope, [Token|Tokens]);

% ## Containers

tokenize([$.,T|Tail], Line, Scope, Tokens) when ?is_space(T) ->
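The hunk shows only the head of the new clause. A plausible body, stated as an assumption rather than as the actual patch, simply drops one space-like character and re-tokenizes with the dot pushed back onto the tail, bumping the line counter when the character was a newline:

tokenize([$.,T|Tail], Line, Scope, Tokens) when ?is_space(T) ->
  % Skip the space-like character and retry with the dot in front again.
  % Repeated application of this clause eats any number of spaces, which is
  % the generalization asked for above.
  case T of
    $\n -> tokenize([$.|Tail], Line + 1, Scope, Tokens);
    _   -> tokenize([$.|Tail], Line, Scope, Tokens)
  end;

With such a clause in place, 'foo. (42)', 'foo.  (42)' and, assuming ?is_space also covers the newline, 'foo.\n(42)' all reduce to the plain 'foo.(42)' case before the dot-call clauses run.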
This is a very elegant solution!
Fixes #901