jsonnet-lint hangs under certain conditions #541

Open

gotwarlost opened this issue Jun 7, 2021 · 3 comments · May be fixed by #555

Comments

@gotwarlost
Contributor

gotwarlost commented Jun 7, 2021

OK, I'm down to one file that jsonnet-lint refuses to process. It basically hangs and ultimately gets killed by what I assume is the OOM killer on my Mac. The weird thing is that sometimes it will return just fine in under a second.

That file has all sorts of proprietary stuff in it, so I've tried my best to create a repro test case, but you'll have to tell me if it worked :)

  • Clone this repo: https://github.com/gotwarlost/jsonnet-lint-issues
  • Run bash-loop.sh. It should show processing times increasing as the number of keys in the caller.jsonnet map grows, eventually reaching a point where jsonnet-lint doesn't return (a rough sketch of such a loop follows below).
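
For reference, bash-loop.sh itself is not reproduced in this issue, and the exact contents of caller.jsonnet are a guess. The sketch below only shows the general idea (regenerate caller.jsonnet with k keys, then time jsonnet-lint on it); the imported file name and key values are hypothetical.

```go
// Hypothetical sketch of a repro loop like bash-loop.sh. The shape of
// caller.jsonnet and the imported lib.libsonnet are guesses, not the
// contents of the actual repro repository.
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
	"time"
)

func main() {
	for k := 20; k <= 32; k++ {
		// Regenerate caller.jsonnet as a map with k keys.
		var b strings.Builder
		b.WriteString("local lib = import 'lib.libsonnet';\n{\n")
		for i := 0; i < k; i++ {
			fmt.Fprintf(&b, "  key%d: lib,\n", i)
		}
		b.WriteString("}\n")
		if err := os.WriteFile("caller.jsonnet", []byte(b.String()), 0o644); err != nil {
			panic(err)
		}

		// Time the linter on the regenerated file.
		start := time.Now()
		out, _ := exec.Command("jsonnet-lint", "caller.jsonnet").CombinedOutput()
		fmt.Printf("keys=%d elapsed=%v\n%s", k, time.Since(start), out)
	}
}
```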

I've no idea what's going on, but it feels like some sort of quadratic/exponential processing, or some lock contention. As I mentioned, the weird part is that every so often it will return just fine.

@gotwarlost
Contributor Author

This is the output of running the bash loop on my machine:

$ ./bash-loop.sh
===
20
===

real	0m0.045s
user	0m0.029s
sys	0m0.011s
===
21
===

real	0m0.071s
user	0m0.057s
sys	0m0.021s
===
22
===

real	0m0.122s
user	0m0.105s
sys	0m0.032s
===
23
===

real	0m0.246s
user	0m0.206s
sys	0m0.072s
===
24
===

real	0m0.420s
user	0m0.370s
sys	0m0.113s
===
25
===

real	0m0.852s
user	0m0.743s
sys	0m0.230s
===
26
===

real	0m1.662s
user	0m1.451s
sys	0m0.449s
===
27
===

real	0m3.288s
user	0m2.925s
sys	0m0.868s
===
28
===

real	0m8.294s
user	0m6.533s
sys	0m2.430s
===
29
===

real	0m26.649s
user	0m15.876s
sys	0m11.228s
===
30
===

real	0m58.873s
user	0m33.493s
sys	0m29.652s
===
31
===
real	2m24.921s
user	1m8.579s
sys	1m9.135s
===
32
===
... hangs

@sbarzowski
Collaborator

Thank you very much for reporting this and for the isolated repro instructions.

I was able to reproduce it getting really slow at higher counts, with roughly a 2x increase each time. This is weird and is probably fixable. For the most part the times are stable (within ~10% of the median), but I can confirm the rare cases of it finishing almost immediately (<0.01s). Those appear to happen more rarely at higher counts. I don't see any obvious reason for this behavior. The only nondeterminism I can think of is the order of map elements.
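
For context on that last point, Go's map iteration order is deliberately randomized, so any logic whose output depends on the order in which map elements are visited can differ from run to run. A minimal, self-contained illustration:

```go
package main

import "fmt"

func main() {
	m := map[string]int{"a": 1, "b": 2, "c": 3, "d": 4, "e": 5}
	// Go randomizes map iteration order on purpose: successive runs of this
	// program (and successive range loops within one run) may print the keys
	// in a different order.
	for k := range m {
		fmt.Print(k, " ")
	}
	fmt.Println()
}
```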

I will update here once I know more.

@sbarzowski
Collaborator

Found the issue. There is some accidental aliasing going on (a classic problem in Go...). This results in an object representation being added to itself multiple times (without removing duplicate information). Normally only already-normalized representations are added like that, so there is no explosion of this kind.

This is also a correctness issue (it may result in fewer warnings).

I have a prototype fix ready, but I need to clean it up (and solve the same issue for arrays and functions).
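
A hypothetical Go sketch of the failure mode described above (these are not the actual go-jsonnet types): when a shallow copy leaves two variables pointing at the same underlying object, merging "the copy" into the original adds the object to itself, doubling its contents on every merge instead of leaving it unchanged. That matches the roughly 2x-per-step growth in the timings.

```go
package main

import "fmt"

// summary is a stand-in for an object representation that accumulates
// possible values; it is illustrative only, not a go-jsonnet type.
type summary struct {
	possible []string
}

// merge appends src's possibilities to dst without deduplication, which is
// harmless as long as dst and src are genuinely distinct objects.
func merge(dst, src *summary) {
	dst.possible = append(dst.possible, src.possible...)
}

func main() {
	// What should have been an independent deep copy...
	orig := &summary{possible: []string{"x"}}
	cp := orig // ...is, after an accidental shallow copy, just another name for orig.

	// Merging "the copy" into the original now adds the object to itself,
	// doubling its contents each time: 2, 4, 8, 16, 32.
	for i := 0; i < 5; i++ {
		merge(orig, cp)
		fmt.Println(len(orig.possible))
	}
}
```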

sbarzowski added a commit to sbarzowski/go-jsonnet that referenced this issue Aug 8, 2021
We had some accidental aliasing due to shallow copy instead of deep copy.

Fixes google#541.
sbarzowski linked a pull request Aug 8, 2021 that will close this issue