[BUG RELEASE & BETA] Detached DOM Trees Between Application Runs #11748

Closed
chadhietala opened this issue Jul 14, 2015 · 7 comments

@chadhietala (Contributor)

@trentmwillis and I believe there are memory leaks that only arise when running acceptance tests, where a new application instance is created for each test. This has resulted in us not being able to run our test suite (3500 tests) end to end, because we simply run out of memory and fall over. To reproduce, run the tests.

In the repro repo you will see something like the following:

This is 100 acceptance tests run in series. Both the listener and node counts continue to rise and only drop a fraction when GC comes along.

If a heap snapshot is taken, you will see several detached DOM trees as well.

I have tested in both 1.13.4 and 2.0.0-beta.2.

Update 7/15
After looking at this with Stef last night, we concluded that we should expect more DOM nodes to be retained per test, since the test harness UI adds a row for each test. However, there still seems to be retention from app instances in certain cases.

/cc @stefanpenner @wycats

Repro Repo
https://github.com/chadhietala/leaky-mcleakerson
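
For reference, the per-test application lifecycle described above looks roughly like the following. This is a minimal sketch assuming the conventional ember-cli `startApp` test helper of that era; it is not code taken from the repro repo:

```js
import Ember from 'ember';
import { module, test } from 'qunit';
import startApp from '../helpers/start-app';

module('Acceptance | example', {
  beforeEach() {
    // a fresh Ember.Application instance is booted for every test
    this.application = startApp();
  },

  afterEach() {
    // the instance is destroyed after each test; anything that still
    // references it (listeners, detached DOM, module-scoped vars) keeps
    // it from being garbage collected
    Ember.run(this.application, 'destroy');
  }
});

test('visiting /index', function(assert) {
  visit('/');

  andThen(function() {
    assert.equal(currentURL(), '/');
  });
});
```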

@stefanpenner (Member)

Thanks. I suspect some misalignment between the app and the app instances is leaking something, likely the instance.

@chadhietala (Contributor, Author)

It appears that setupForTesting introduces retained DOM nodes.

Without setupForTesting, 100 test runs:

With setupForTesting, 100 test runs:

The run with setupForTesting looks really strange.
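
For context, the `startApp` helper of that era typically boots the application with `setupForTesting()` and `injectTestHelpers()`. A rough sketch of what that involves (an assumed, generic helper, not code from the repro repo):

```js
import Ember from 'ember';
import Application from '../../app';
import config from '../../config/environment';

export default function startApp(attrs) {
  let application;
  let attributes = Ember.merge({}, config.APP);
  attributes = Ember.merge(attributes, attrs);

  Ember.run(function() {
    application = Application.create(attributes);
    // defers readiness and swaps in the test adapter
    application.setupForTesting();
    // registers the global async helpers (visit, andThen, currentURL, ...);
    // this call is where the retained DOM nodes appear to come from
    application.injectTestHelpers();
  });

  return application;
}
```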

@stefanpenner (Member)

I believe I have found and spiked solutions to the issues reported in the provided example.

This afternoon, I will start applying the appropriate solutions to the various offending projects.

Here is a delta comparison of before -> after heap snapshot.

[screenshot: before -> after heap snapshot comparison, 2015-07-20]

It's also worth noting that, in the after snapshot, there are exactly 0 leaked Ember.Applications. I have identified at least 3 or 4 different ways Ember.Application instances leak, which ultimately resulted in this.

@stefanpenner (Member)

Alright, I think this leak is fixed. It has many discrete parts, so we can close this now, or close it when all of the above are merged and released.

@stefanpenner (Member)

It's worth noting that this problem is partially related to the fact that we don't purge modules as they are included. Doing so would only fix a small part of the larger leak, but it may be worth exploring down the road.
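
To illustrate the point about modules never being purged (an assumed example, not code from the repro repo): any module-scoped variable that still points at an application keeps that instance, and everything it references, reachable for the rest of the suite, because the test module itself is never released.

```js
import Ember from 'ember';
import { module } from 'qunit';
import startApp from '../helpers/start-app';

// module-scoped: lives until the whole suite finishes, so whatever it
// points at (the app, its container, any DOM it touched) stays reachable
let application;

module('Acceptance | leaky example', {
  beforeEach() {
    application = startApp();
  },
  afterEach() {
    Ember.run(application, 'destroy');
    // the destroyed instance is still referenced by `application` here,
    // and will be until the module itself is released
  }
});
```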

@stefanpenner (Member)

We are going to take an additional approach and fix the issue in QUnit, so we can release the modules and not force people to null out each var.

related issue: qunitjs/qunit#841

w/ @cibernox
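
Until a QUnit-side fix lands, the manual workaround mentioned above is to break each module-scoped reference yourself. A small sketch, under the same assumptions as the example in the previous comment:

```js
import Ember from 'ember';
import { module } from 'qunit';
import startApp from '../helpers/start-app';

let application;

module('Acceptance | example', {
  beforeEach() {
    application = startApp();
  },
  afterEach() {
    Ember.run(application, 'destroy');
    // manually null out the module-scoped reference so the destroyed
    // instance can actually be garbage collected
    application = null;
  }
});
```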
