Inference hangs on some files #16

Closed
marijnh opened this Issue Mar 20, 2013 · 8 comments

@marijnh marijnh added a commit that closed this issue Mar 22, 2013
@marijnh marijnh Drop propagations when a new constraint/type generates too many of them
Crude guard against infinite expansion.

Closes #16
e776ee7
@marijnh marijnh closed this in e776ee7 Mar 22, 2013
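
The commit referenced above caps type propagation instead of letting it recurse without bound. A minimal sketch of that general idea, with hypothetical names (AVal, MAX_PROPAGATIONS, propagationCount are illustrative, not Tern's actual internals):

// A hypothetical "crude guard against infinite expansion": stop propagating
// once a single abstract value has triggered too many propagations.
// Names and the cutoff below are illustrative, not Tern's actual code.
var MAX_PROPAGATIONS = 10000;

function AVal() {
  this.types = [];             // concrete types seen so far
  this.forward = [];           // values to notify when a new type arrives
  this.propagationCount = 0;   // how many propagations this value has triggered
}

AVal.prototype.addType = function(type) {
  if (this.types.indexOf(type) > -1) return;            // already known, nothing to do
  if (this.propagationCount > MAX_PROPAGATIONS) return; // drop instead of expanding forever
  this.types.push(type);
  for (var i = 0; i < this.forward.length; ++i) {
    ++this.propagationCount;
    this.forward[i].addType(type);                      // may recurse through the graph
  }
};

var a = new AVal(), b = new AVal();
a.forward.push(b);    // types flowing into a also flow into b
a.addType("string");  // propagates to b; the counter caps runaway chains
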
@sergeche
Contributor

infer.js still fails on minified underscore.js; it happens when I request completions:

IndexError: RangeError: Maximum call stack size exceeded ( js/infer.js @ 43 : 21 )  ->       if (this.types.indexOf(type) > -1) return;
@marijnh
Member
marijnh commented Mar 26, 2013

It works for me now. Can you describe how you're running Tern when this happens?

@marijnh marijnh reopened this Mar 26, 2013
@sergeche
Contributor

I’m running Tern on a project with 30 files. It happens when I’m requesting completions like so: https://github.com/sergeche/ternjs-sublime/blob/master/ternjs/js/controller.js#L205

It works fine on a simple project with two files, though. I'll try to narrow down my tests to figure out when this error appears.
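
For context, a completions query to the Tern server is a JSON document along these lines (the file name, cursor position, and buffer text below are placeholders, not taken from the project above):

// Hypothetical completion request for a Tern server; names and positions are made up.
var request = {
  query: {
    type: "completions",
    file: "js/app.js",
    end: { line: 204, ch: 12 },   // cursor position to complete at
    types: true                   // include type information with each completion
  },
  files: [
    { type: "full", name: "js/app.js", text: "/* current buffer contents */" }
  ]
};

A request of this shape is what the editor plugin sends when it asks for completions.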

@jeffkenton

I'm seeing something similar:

With one good-sized file in its own directory (jquery-1.7.js, about 9300 lines), I add the following lines of code by hand (don't cut and paste them all at once) about 2/3 of the way down the file:

var x = "foo";
var y = x.length;
var x1 = "bar";
var y1 = x1.length;
var x2 = "baz";
var y2 = x2.length;
var x3 = "33333";
var y3 = x3.length;

The time required for each successive hint increases from 700ms to 5800ms to 38900ms, and so on. All the time is spent in infer.js, in the inferWrapper AST walk, but I don't know why. It might be the same problem you're seeing.
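
A rough way to measure the same slowdown outside an editor, assuming Tern's node Server API (new tern.Server, addFile, request) is available; file paths and cursor positions below are placeholders:

// Hypothetical timing harness: load the file into a Tern server and time
// successive completion requests. Paths and positions are placeholders.
var fs = require("fs");
var tern = require("tern");

var server = new tern.Server({});
server.addFile("jquery-1.7.js", fs.readFileSync("jquery-1.7.js", "utf8"));

function timeCompletion(line, ch) {
  var start = Date.now();
  server.request({
    query: { type: "completions", file: "jquery-1.7.js", end: { line: line, ch: ch } }
  }, function(err, data) {
    if (err) throw err;
    console.log(line + ":" + ch + " -> " + (Date.now() - start) + "ms, " +
                data.completions.length + " completions");
  });
}

// Time a completion after each hand-added `xN.length` line (line numbers are made up).
timeCompletion(6201, 14);
timeCompletion(6203, 16);
timeCompletion(6205, 16);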

@marijnh
Member
marijnh commented Mar 27, 2013

@jeffkenton Are you using the current code? Some changes were made yesterday that should help with this.

@jeffkenton

Not yet. I'll try it.

Thanks.

@marijnh
Member
marijnh commented Apr 5, 2013

Closing. Please file a new issue (with test case) when it comes up again.

@marijnh marijnh closed this Apr 5, 2013
@jeffkenton

Thanks.
