VCC will not parse a literal negative number where INT is expected #2167

Closed
slimhazard opened this Issue Dec 14, 2016 · 1 comment

@slimhazard
Contributor

Expected Behavior

VCC should parse a literal negative number such as -10 where an INT is expected.

The C typedef for VCL_INT is (signed) long, so it's not a problem with the data type, just the parser.
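
For illustration only (the example VMOD and its set_limit() function below are made up), a call like the following should compile once the parser accepts the sign:

    import example;   # hypothetical VMOD declaring: VOID set_limit(INT)

    sub vcl_init {
        example.set_limit(-10);   # currently rejected by VCC
    }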

Current Behavior

When VCC sees -, it emits this error message and fails the parse:

Message from VCC-compiler: Unknown token '-' when looking for INT

Both of these workarounds will get a negative value into the INT:

std.integer("-10", 0)
0 - 1   # i.e. 1 subtracted from 0

std.integer("-10", -10) doesn't work, because INT is specified for the fallback parameter, so the same error is raised.
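
In context, the subtraction workaround looks something like this (a sketch only; the header names are arbitrary):

    import std;

    sub vcl_recv {
        # -10 has to be spelled as a subtraction for now
        if (std.integer(req.http.x-limit, 0 - 10) < 0) {
            set req.http.x-negative = "true";
        }
    }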

Possible Solution

If I'm reading the VCC code right, it looks like the CNUM type will have to permit the expression to begin with -. In that case, the conversion will have to negate the result of vcc_UintVal(), which currently returns unsigned (and whose comments say that it expects digits only).

Steps to Reproduce (for bugs)

Use a negative literal in VCL at any position where INT is expected.
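
A minimal sketch (assuming vmod_std's two-argument std.integer(STRING, INT) as above; the backend and header names are arbitrary) that fails to compile:

    vcl 4.0;

    import std;

    backend default { .host = "127.0.0.1"; }

    sub vcl_recv {
        # VCC fails here with: Unknown token '-' when looking for INT
        set req.http.x-n = std.integer(req.http.x-n, -10);
    }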

Context

VMOD development, but this is a general VCL issue.

Your Environment

Varnish trunk (but the problem has presumably always been there)

@gquintard
Contributor
gquintard commented Jan 4, 2017

One slightly less annoying way of doing it is to trick the parser with a subtraction:

std.integer("-10", 0-10)
@bsdphk added a commit that closed this issue Jan 13, 2017
bsdphk: Allow INT and REAL to be negative.
Fixes: #2167
ca438c8
@bsdphk closed this in ca438c8 Jan 13, 2017
@hermunn added a commit that referenced this issue Feb 8, 2017
bsdphk + hermunn: Allow INT and REAL to be negative.
Fixes: #2167

Conflicts:
	bin/varnishtest/tests/m00019.vtc
	lib/libvcc/vcc_expr.c
efc000f