feat(connect-queue) optionally change the connect queue threshold #729

Merged: 5 commits merged into aurelia:master on Jan 12, 2019

Conversation

dannyBies (Contributor) commented Jan 11, 2019

As discussed in #134, I implemented the suggested feature and added a test to verify this behaviour.
After experimenting a bit more, I have been able to get performance gains of around 60% on the initial loading of my screens by disabling the connect-queue.

Note:
I have added enableConnectQueue to make sure that the connectQueue can be enabled again after calling disableConnectQueue(). Technically this can be achieved by setConnectQueueThreshold(100), but I don't think developers should have to know that the default is 100. I can remove disableConnectQueue if needed.
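
For reference, here is roughly how I use it in my app. This is just a minimal sketch: the import from 'aurelia-binding' and the main.js configure hook are illustrative, assuming the new functions end up exported from the package entry point.

import { disableConnectQueue, enableConnectQueue } from 'aurelia-binding';

export function configure(aurelia) {
  aurelia.use.standardConfiguration();

  // Bindings connect immediately instead of being batched through the connect queue;
  // this is where the ~60% initial-load gain came from in my app.
  disableConnectQueue();

  aurelia.start()
    .then(() => aurelia.setRoot())
    .then(() => {
      // Restore the default batching behaviour once the first screen is up.
      enableConnectQueue();
    });
}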

fkleuver (Member) commented Jan 11, 2019

Excellent, thanks for taking the initiative and also adding a test :)

Just one request: would you mind also creating a test that verifies enableConnectQueue and disableConnectQueue? I know it may seem a bit redundant/obvious, but I would just like to make sure that 1) it's covered and 2) the tests protect against regressions should someone touch this code again for one reason or another. It's one of the things that should help vCurrent keep working and stay robust in the future.

Also, I think it's preferable to set the connect queue back to its original state at the end of the test, just so it doesn't influence other tests.
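
For example, something along these lines (just a rough Jasmine-style illustration; resetting via setConnectQueueThreshold(100) assumes the default of 100 mentioned above):

it('connects immediately when the connect queue is disabled', done => {
  disableConnectQueue();

  // ... exercise a few bindings and assert they connect without waiting for a flush ...

  // Put the queue back in its default state so it doesn't influence other tests.
  setConnectQueueThreshold(100);
  done();
});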

Other than that, looks solid!

CLAassistant commented Jan 12, 2019

CLA assistant check
All committers have signed the CLA.

dannyBies (Contributor) commented Jan 12, 2019

@fkleuver I've added the test cases. Unfortunately, this has resulted in flaky tests: with every test run, 2 or 3 tests randomly fail with the message Expected 'bar' to be 'baz'. I'm not familiar enough with the codebase to tell whether this has to do with the implementation or with the tests.

I've tried flushing the TaskQueue, calling flush in connect-queue.js, and setting immediate to 0, but nothing seems to help. Do you have any idea what the problem could be?

fkleuver (Member) commented Jan 12, 2019

There are probably some race conditions going on with other tests, because rAF has a bigger delay than any other kind of promise or timeout. You could try forcing the test to both start running after the previous rAF has resolved and only finish after the next rAF has resolved, to ensure there are no queue leftovers from either earlier or later tests.

You could try something like this:

it('works?', done => {
  // Wait for any rAF work queued by earlier tests to flush before starting.
  PLATFORM.requestAnimationFrame(() => {
    // test code

    // Only signal completion after this test's own rAF work has flushed,
    // so nothing leaks into later tests.
    PLATFORM.requestAnimationFrame(() => {
      done();
    });
  });
});

If that's still flaky, at least that gives you the appropriate timings to be trying stuff like manually flushing and whatnot. setTimeout is definitely not going to accomplish a whole lot :)

In addition, you could export helper functions getConnectQueueThreshold and getConnectQueueSize (they don't necessarily need to be in the type definitions; it's just internal) to assert before/after your tests that everything is indeed "clean".
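
Roughly what I have in mind (the internal names below are only placeholders for whatever connect-queue.js actually calls its threshold variable and queue array):

// connect-queue.js (internal helpers)
export function getConnectQueueThreshold() {
  return minimumImmediate; // placeholder name for the internal threshold variable
}

export function getConnectQueueSize() {
  return queue.length; // placeholder name for the internal queue array
}

// then, in a test:
expect(getConnectQueueSize()).toBe(0);        // nothing left over from other tests
expect(getConnectQueueThreshold()).toBe(100); // back at the default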

dannyBies (Contributor) commented Jan 12, 2019

That seems to have worked, thanks for your help!
Are there any other tests you think I should add, or is it okay like this?

fkleuver (Member) commented Jan 12, 2019

Nice, thank you for fixing this!
I don't think additional tests are needed. The assertions are very precise, so this looks solid both for verifying the behavior and for preventing regressions.
LGTM @EisenbergEffect

EisenbergEffect (Member) commented Jan 12, 2019

Nice. Tests and all!

EisenbergEffect merged commit 45a2aca into aurelia:master on Jan 12, 2019

2 checks passed

WIP: Legacy commit status override
license/cla: Contributor License Agreement is signed.
dannyBies (Contributor) commented Jan 15, 2019

Thanks for merging this @EisenbergEffect! When do you think this will be released?

EisenbergEffect (Member) commented Jan 15, 2019

I've got a slew of new releases for the various libraries on my plate. I'll probably start working on getting those out tomorrow. Not sure whether binding will be in tomorrow's batch; it will depend on how the process goes. It should be this week, though.

Side note: our vCurrent release process isn't as fast as we'd like, but we've done massive work for vNext to make this better. We're now able to do nightly builds for vNext, and with a click of a button I can turn any of those into an official release. Speed of release is something we've worked on; it's just much easier to do with our new monorepo than with our vCurrent multi-repo approach. Thanks for your patience.

dannyBies (Contributor) commented Jan 15, 2019

That's great to hear. Thanks and keep up the good work!
