Fixed bug that caused void(0) to be interpreted as a keyword instead of a function call, thus causing a ParentheticalNode to be created and causing the ecma visitor to output void((0))
camertron authored and Doug McInnes committed Nov 3, 2011
1 parent 4f6de83 commit 129edd3
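The regression is easy to reproduce through RKelly's public entry points by parsing a void(0) statement and serializing it back out. A minimal sketch, assuming the rkelly gem's RKelly::Parser and Node#to_ecma API:

    require 'rkelly'

    # Parse a void(0) statement into an AST.
    ast = RKelly::Parser.new.parse('void(0);')

    # Before this fix, "void" was matched as the unary keyword operator, the
    # parenthesized argument became a ParentheticalNode, and serializing the
    # AST produced "void((0));". With the grammar change below, void(0)
    # parses as a function call and round-trips unchanged.
    puts ast.to_ecma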
Showing 3 changed files with 18 additions and 3 deletions.
lib/parser.y — 3 changes: 2 additions & 1 deletion

@@ -181,7 +181,8 @@ rule
   ;
 
 CallExprNoBF:
-    MemberExprNoBF Arguments { result = FunctionCallNode.new(val[0], val[1]) }
+    VOID '(' MemberExpr ')' { result = FunctionCallNode.new(ResolveNode.new(val[0]), val[2]) }
+  | MemberExprNoBF Arguments { result = FunctionCallNode.new(val[0], val[1]) }
   | CallExprNoBF Arguments { result = FunctionCallNode.new(val[0], val[1]) }
   | CallExprNoBF '[' Expr ']' { result = BracketAccessorNode.new(val[0], val[2]) }
   | CallExprNoBF '.' IDENT { result = DotAccessorNode.new(val[0], val[2]) }
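The new first alternative special-cases VOID followed by a parenthesized expression: instead of letting the void unary operator wrap a ParentheticalNode, the parser builds a FunctionCallNode whose target is a ResolveNode for the void identifier, so the ECMA visitor prints plain void(0). A sketch that inspects the resulting AST shape, assuming rkelly's pointcut helper for matching nodes by class:

    require 'rkelly'

    ast = RKelly::Parser.new.parse('void(0);')

    # With the new production the statement is a FunctionCallNode whose
    # target resolves the identifier "void" (ResolveNode.new(val[0]) above),
    # not a VoidNode wrapping a ParentheticalNode.
    call = ast.pointcut(RKelly::Nodes::FunctionCallNode).matches.first
    p call.value.class  # => RKelly::Nodes::ResolveNode

    # No ParentheticalNode is created, so no extra parentheses are emitted.
    p ast.pointcut(RKelly::Nodes::ParentheticalNode).matches.empty?  # => true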
lib/rkelly/tokenizer.rb — 2 changes: 1 addition & 1 deletion

@@ -89,7 +89,7 @@ def initialize(&block)
       [value, value]
     end
   end
 
   def tokenize(string)
     raw_tokens(string).map { |x| x.to_racc_token }
   end
test/test_tokenizer.rb — 16 changes: 15 additions & 1 deletion

@@ -64,7 +64,7 @@ def test_increment
     ], tokens)
   end
 
-  def test_regex
+  def test_regular_expression
     tokens = @tokenizer.tokenize("foo = /=asdf/;")
     assert_tokens([
       [:IDENT, 'foo'],
@@ -74,6 +74,20 @@ def test_regex
     ], tokens)
   end
 
+  def test_regular_expression_invalid
+    tokens = @tokenizer.tokenize("foo = (1 / 2) / 3")
+    assert_tokens([[:IDENT, "foo"],
+                   ["=", "="],
+                   ["(", "("],
+                   [:NUMBER, 1],
+                   ["/", "/"],
+                   [:NUMBER, 2],
+                   [")", ")"],
+                   ["/", "/"],
+                   [:NUMBER, 3]
+                  ], tokens)
+  end
+
   def test_regular_expression_escape
     tokens = @tokenizer.tokenize('foo = /\/asdf/gi;')
     assert_tokens([
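The renamed test and the new test_regular_expression_invalid pin down the classic slash ambiguity: after a value such as a closing parenthesis, / must tokenize as the division operator, while after an operator such as = it begins a regular expression literal. A usage sketch, assuming tokenize returns [type, value] pairs as the assertions above do; filtering whitespace (:S) tokens and the :REGEXP token name are my assumptions about the tokenizer and the assert_tokens helper:

    require 'rkelly'

    tokenizer = RKelly::Tokenizer.new

    # Collect token types, dropping whitespace tokens to mirror what the
    # assert_tokens helper presumably compares against.
    types = lambda do |src|
      tokenizer.tokenize(src).map(&:first).reject { |t| t == :S }
    end

    # After '=', the slash starts a regular expression literal.
    p types.call('foo = /=asdf/;')    # includes :REGEXP

    # After ')', the same character is the division operator.
    p types.call('foo = (1 / 2) / 3') # includes "/" twice, no :REGEXP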
