
feat(lifecycle): new callbacks, async tasks, and updated template controllers #171

Merged · 22 commits into master · Sep 27, 2018

Conversation

@EisenbergEffect (Contributor) commented Sep 15, 2018

Background

Previously, when a renderable (custom element or view) was told to detach, it always removed its DOM nodes, even if its parent had already been removed from the DOM. This behavior is a carryover from vCurrent and is highly inefficient. This PR attempts to address the issue.

Solution

There are several pieces to this:

  1. Update the DetachLifecycle so that when a renderable owns the lifecycle, only its nodes are removed.
  2. Update the DetachLifecycle so that if a non-renderable owns the lifecycle (e.g. Repeat), only top-level renderables (the views managed by the Repeat, not children of those views) have their nodes removed.
  3. Abstract node removal for renderables so that each renderable can individually track whether or not its nodes have been added.
  4. Improve the View's abstraction for assigning its render location by adding a mount API that replaces the previous onRender callback.
  5. Update renderables to only add their nodes when they are not already mounted (see the sketch after this list).
  6. Add special behavior for containerless custom elements, since they are the only custom element scenario where the actual nodes need removal. (This is because all other scenarios are handled by the fact that the container is removed.)
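
To make items 3-5 concrete, here is a minimal TypeScript sketch. It is not the PR's actual code; the `INodeSequence`, `DetachLifecycle`, and owner-check shapes are assumptions for illustration of a renderable that tracks whether its nodes are mounted and only removes them when the detach lifecycle's owner requires it.

```ts
// Minimal sketch, assuming hypothetical shapes for INodeSequence and DetachLifecycle;
// the real PR's signatures may differ.
interface INodeSequence {
  insertBefore(refNode: Node): void;
  remove(): void;
}

class DetachLifecycle {
  constructor(public readonly owner: object) {}
  // Only the owner of the lifecycle (or its top-level renderables) removes nodes;
  // everything nested deeper leaves the DOM along with its ancestor.
  shouldRemoveNodes(requestor: object): boolean {
    return requestor === this.owner;
  }
}

class View {
  private location: Node | null = null;
  private isMounted: boolean = false;

  constructor(private readonly nodes: INodeSequence) {}

  // Replaces the old onRender callback: the view is told where it belongs,
  // but its nodes are only added if they are not already in the DOM.
  mount(location: Node): void {
    this.location = location;
    if (!this.isMounted) {
      this.nodes.insertBefore(location);
      this.isMounted = true;
    }
  }

  detach(lifecycle: DetachLifecycle): void {
    // Skip DOM work when an ancestor already removed this subtree.
    if (this.isMounted && lifecycle.shouldRemoveNodes(this)) {
      this.nodes.remove();
      this.isMounted = false;
    }
    // The detached callback would still run here regardless of node removal.
  }
}
```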

Note: As part of this PR, some vestigial code left over from the previous Compose refactor was removed. A few minor refactors/renames were performed as well.

Related Issues

TODO

All existing tests pass, but new tests describing the lifecycle behavior still need to be added. Once those are in place, this PR will be ready.

codeclimate bot commented Sep 15, 2018

Code Climate has analyzed commit fb8fbd2 and detected 21 issues on this pull request.

Here's the issue category breakdown:

Category Count
Complexity 8
Duplication 4
Style 8
Bug Risk 1

The test coverage on the diff in this pull request is 75.4% (50% is the threshold).

This pull request will bring the total coverage in the repository to 86.1% (-0.9% change).

View more on Code Climate.

@AureliaEffect (Member)

This pull request introduces 1 alert when merging 1c288e0 into 978c2f0 - view on LGTM.com

new alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@bigopon (Member) commented Sep 16, 2018

Does this have any implications for the granularity of each View that doesn't own the lifecycle?

@EisenbergEffect (Contributor, Author)

Can you clarify the question or provide an example? I'm not quite sure.

@bigopon (Member) commented Sep 16, 2018

For views with behaviors, the detached lifecycle callback means the view has been detached and its nodes put back in their own place, no longer in the parent location they were attached to. With this change, from what I understand, that's no longer true. Could it affect caching?

@EisenbergEffect (Contributor, Author)

Oh man, that's probably a problem. I'll have to think about that. It might be worth it to just remove caching; this new behavior is probably more important. I'm not sure.

@EisenbergEffect (Contributor, Author) commented Sep 16, 2018

Well, for starters, when we return a view to cache, we would need to:

  • null out the location
  • remove the nodes

However, there's a trick to this, since we don't want to remove the nodes or return the view to the cache at the wrong time with respect to the lifecycle itself. So we may need to tweak these APIs further.
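
A rough sketch of those two steps, with hypothetical names building on the mount/detach sketch earlier (not the real API): the timing concern is handled by only returning a view once its detach lifecycle has completed.

```ts
// Hypothetical sketch: `release` stands for the two steps above
// (null out the location, remove the nodes).
interface ICacheableView {
  release(): void;
}

class ViewCache {
  private readonly pool: ICacheableView[] = [];

  constructor(private readonly maxSize: number = 100) {}

  // Call only once the detach lifecycle has finished, so lifecycle callbacks
  // never observe a view whose nodes were yanked out mid-flight.
  tryReturnToCache(view: ICacheableView): boolean {
    if (this.pool.length >= this.maxSize) {
      return false; // caller disposes of the view instead
    }
    view.release();
    this.pool.push(view);
    return true;
  }

  tryGetFromCache(): ICacheableView | undefined {
    return this.pool.pop();
  }
}
```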

@EisenbergEffect (Contributor, Author)

This stinks...massive complication I think. Ugh...

@EisenbergEffect (Contributor, Author)

Maybe we can make it work with a requirement that you must own the lifecycle object to return things to the cache? Thoughts?

@bigopon (Member) commented Sep 16, 2018

It might be worth it to just remove caching.

Caching requires the lowest amount of LOC and effort for the perf gain it provides, so I think it will be very hard to justify removing it. I remember repeat used to have its own caching mechanism, which was later merged into the templating layer. Maybe you can give some rationale behind that move so we have a stronger basis for discussion?

Maybe we can make it work with a requirement that you must own the lifecycle object to return things to the cache? Thoughts?

From the looks of it, at any point there is only one owner. That seems limiting, and I don't think it's appropriate to impose that restriction.

@EisenbergEffect (Contributor, Author)

I'll probably need to rethink the entire detach lifecycle again then 😭

@bigopon (Member) commented Sep 16, 2018

For inspiration, there is this plugin from @huochunpeng: https://github.com/buttonwoodcx/bcx-aurelia-reorderable-repeat. It may or may not employ caching, but it shows the strength of having granular control: the ability to implement whatever behavior fits your needs, in any arbitrary way.

@EisenbergEffect (Contributor, Author)

I'll explore the link... but I think it actually makes sense that something should only return to the cache when its owner controls the lifecycle. Otherwise, you get a lot of really inefficient behavior for what basically amounts to an alternate form of caching.

So the idea is: if a Repeat is explicitly adding and removing views, then it owns the lifecycle and can determine that certain views should go back to the cache. But if there's a view several layers above that Repeat that owns the lifecycle (or a Repeat that's calling it), then only that view can be put into or taken out of the cache. There's no need to do anything with the children; that would just be extra work.

Does that make sense?
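
A hedged sketch of that idea, with invented names: a Repeat-like controller that owns the detach lifecycle returns only its own top-level views to the cache, and the children of those views are left alone.

```ts
// Invented names for illustration; only the lifecycle owner touches the cache.
interface IOwnedView {
  detach(lifecycle: { owner: object }): void; // runs detached callbacks down the tree
  release(): void;                            // null location + remove nodes (called by the cache)
}

interface IViewCache {
  tryReturnToCache(view: IOwnedView): boolean;
}

class RepeatLike {
  private views: IOwnedView[] = [];

  constructor(private readonly cache: IViewCache) {}

  removeAllViews(): void {
    const lifecycle = { owner: this }; // the Repeat owns this detach lifecycle
    for (const view of this.views) {
      view.detach(lifecycle);            // descendants do no DOM work of their own
      this.cache.tryReturnToCache(view); // only the top-level views are cached
    }
    this.views = [];
  }
}
```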

@fkleuver (Member) commented Sep 16, 2018

The problem is that efficient caching means components often need to know something about their ancestors and descendants. This is something I managed to partly solve in binding with the flags: an ancestor can tell a descendant what it's doing, and the descendant can decide, based on that, whether or not to actually remove the views during $unbind. I think something similar can be done with attach/detach.

I've got a few ideas on how to improve this, but I need to sit down with it for a bit. For now, it's important that we get a few meaningful benchmarks so we can play around with different approaches and compare the impact rather than theorize about it. Perfect caching is impossible anyway, since some assumptions need to be made.
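
A rough illustration of that flags approach (the flag names are invented; the real vNext flags differ): an ancestor passes a flag down its detach call, and the descendant uses it to skip redundant DOM removal while still running its callbacks.

```ts
// Invented flag names for illustration only.
enum DetachFlags {
  none               = 0,
  parentRemovesNodes = 1 << 0, // an ancestor takes the whole subtree out of the DOM
}

class ChildRenderable {
  $detach(flags: DetachFlags): void {
    // If an ancestor already removes the subtree, skip per-node DOM work here
    // and only run the detached callback and bookkeeping.
    if ((flags & DetachFlags.parentRemovesNodes) === 0) {
      this.removeNodes();
    }
    this.detached();
  }

  private removeNodes(): void {
    // actual DOM removal would happen here
  }

  private detached(): void {
    // user-facing detached callback
  }
}
```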

@bigopon (Member) commented Sep 16, 2018

@EisenbergEffect Yes, that makes perfect sense. It seems to me that only views from template controllers will behave differently compared to normal views, which means it can very much be the responsibility of the template controller implementation to decide what to do with those views. I think it's safe to assume there shouldn't be any issue; my previous concern was probably unnecessary.

Besides that, I think another potential issue could be a memory leak where view nodes hold references to the view model, either directly or through a model object. But maybe I'm just over-worrying.

@fkleuver (Member) commented Sep 16, 2018

I think caching should be done first and foremost at the template-string level, preferably via some globally accessible view factory, much like the expression parser cache. The key difference, of course, is that you can have multiple cached elements per template string.

If a repeater creates 10,000 instances of some view and returns them to the cache, there will be 10,000 cached views for that particular string available for any other component to grab. This becomes a memory problem when you have dynamically generated templates that differ and are never reused: you end up with an ever-expanding cache. That is another reason to do caching globally; a cache size limit can keep dynamic templates in check, not just large amounts of the same single template.
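
A minimal sketch of such a global cache (hypothetical, not an existing API): views are pooled per template string, with a per-template cap and an overall cap so dynamically generated, never-reused templates can't grow the cache without bound.

```ts
// Hypothetical global view cache keyed by template string. A per-key cap plus an
// overall cap keeps one huge repeat, or many one-off dynamic templates, in check.
class GlobalViewCache<TView> {
  private readonly byTemplate = new Map<string, TView[]>();
  private total = 0;

  constructor(
    private readonly maxPerTemplate: number = 100,
    private readonly maxTotal: number = 10_000,
  ) {}

  tryAdd(template: string, view: TView): boolean {
    const pool = this.byTemplate.get(template) ?? [];
    if (pool.length >= this.maxPerTemplate || this.total >= this.maxTotal) {
      return false; // caller disposes of the view instead
    }
    pool.push(view);
    this.byTemplate.set(template, pool);
    this.total++;
    return true;
  }

  tryGet(template: string): TView | undefined {
    const view = this.byTemplate.get(template)?.pop();
    if (view !== undefined) {
      this.total--;
    }
    return view;
  }
}
```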

Besides that, I think another potential issue could be a memory leak where view nodes hold references to the view model, either directly or through a model object. But maybe I'm just over-worrying.

This is a valid concern, and it should be the responsibility of $unbind and $detach to make sure no references linger. We need to test for this.

@fkleuver (Member) commented Sep 16, 2018

It might be worth it to just remove caching.

Maybe you can give some rationale behind that move so we can have stronger base for discussion?

I can give a rationale for that: "premature optimization is the root of all evil" (and yes, I'm very guilty of it myself). I think if we remove caching for the time being (and take the animator along with it, please), we'll significantly simplify the templating code base, allowing us to focus on getting everything to work correctly first. When everything is 100%, tested, proven, and beautiful, we can build caching back in (and have the tests tell us exactly when we're doing something wrong).

Please, let's stop trying to get the full vCurrent feature/perf surface working in one go, and try to do things a bit more incrementally. It will take less time if we toss a few things out and add them back later, because we'll all be able to see more clearly in the meantime.

Edit 2: Just to add some proof to the assertion above, I didn't rework observation from vCurrent either. I started completely from scratch with 95% of the features missing. I would have lost my sanity if I had tried to do it any other way.

@EisenbergEffect (Contributor, Author)

I think it's possible you're missing some of the conversations (and painful experiences) we had with vCurrent on the core team in the past, @fkleuver. One of the biggest issues with vCurrent is that we waited too long to integrate caching... and as a result created a lot of problems. Animation was also integrated in a poor way and caused complexity and oddity throughout. The vNext template system, while lacking many tests, has been rebuilt incrementally and is technically almost feature-complete, with the exception of animation.

For me, the next step is fixing a core issue of vCurrent around how nodes are added/removed in hierarchies. This issue has bothered me for over three years, and it's one I've literally waited nine months to tackle since the vNext effort started. I believe the time is right to revisit it. After that, things will be in place to handle animation. Some of the work to set that up is part of this PR; some of it was part of the template controller rewrites I did, which established the patterns that enable this. So this isn't out of nowhere for me; it's been on my radar for a while, and I've been strategically doing other incremental work to prepare the way for it.

@fkleuver (Member)

I understand how you've been working towards this, and it makes perfect sense for the most part. I'm also trying to understand and help ensure that all design decisions are coherent with some of the fundamental changes that have been made in vNext.

All that said, whenever you work on something it always comes out a lot better, so I'll wait this one out :)

@EisenbergEffect (Contributor, Author)

I think I probably need to write a little spec for this, to think through some of the scenarios. The PR here (once it has tests) is not a bad first crack; it's definitely an improvement, but not quite what it needs to be.

codecov bot commented Sep 19, 2018

Codecov Report

Merging #171 into master will decrease coverage by 0.91%.
The diff coverage is 75.55%.


@@            Coverage Diff             @@
##           master     #171      +/-   ##
==========================================
- Coverage   86.19%   85.28%   -0.92%     
==========================================
  Files          78       78              
  Lines        6020     6257     +237     
  Branches     1066     1109      +43     
==========================================
+ Hits         5189     5336     +147     
- Misses        831      921      +90
Impacted Files Coverage Δ
packages/runtime/src/templating/index.ts 100% <ø> (ø) ⬆️
packages/runtime/src/templating/renderable.ts 100% <ø> (+4.76%) ⬆️
...ackages/runtime/src/templating/rendering-engine.ts 96.33% <100%> (-0.07%) ⬇️
packages/runtime/src/templating/resources/with.ts 100% <100%> (ø) ⬆️
...es/runtime/src/templating/resources/replaceable.ts 100% <100%> (ø) ⬆️
packages/runtime/src/dom.ts 89.58% <100%> (+4.58%) ⬆️
packages/runtime/src/templating/renderer.ts 94.26% <100%> (ø) ⬆️
packages/runtime/src/templating/lifecycle.ts 62.43% <62.92%> (-26.66%) ⬇️
packages/runtime/src/templating/custom-element.ts 71.28% <63.63%> (-0.98%) ⬇️
...ackages/runtime/src/templating/custom-attribute.ts 82.55% <70.58%> (-3.87%) ⬇️
... and 10 more

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9eb0764...fb8fbd2.

@AureliaEffect (Member)

This pull request fixes 1 alert when merging 63959e7 into 57c07e8 - view on LGTM.com

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@AureliaEffect (Member)

This pull request fixes 1 alert when merging 039309a into 57c07e8 - view on LGTM.com

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@bigopon (Member) commented Sep 21, 2018

@EisenbergEffect From the commit, I understand that if we want to implement a list with a staggered enter animation, we need to scope the lifecycle to the list and create a child lifecycle for each list item, for both detached and attached?

@EisenbergEffect (Contributor, Author)

For a staggered enter animation, you don't need to do anything with the lifecycle. The lifecycle is now generalized to simply allow a task to block node adding and the attached callback, or to block node removal and the detached callback. It's a general hook that an animation system could use to prevent nodes from being removed before an animation finishes (it also enables synchronous compose and other scenarios); see the sketch below. I don't think an animation system would use that for attach, since you don't want to block adding nodes. This removes the need for the animation system to exist in the core and allows it to live entirely in a plugin. It's possible we'll still have a standard mechanism like before, but it doesn't need to live inside the runtime now.
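
A hedged sketch of that hook (names invented for illustration, not the actual API): a plugin registers an async task with the detach lifecycle, and node removal plus the detached callback wait until every registered task has settled.

```ts
// Invented names for illustration: async tasks that can block node removal
// and the detached callback until they settle (e.g. a leave animation).
interface ILifecycleTask {
  readonly done: boolean;
  wait(): Promise<unknown>;
}

class DetachLifecycleSketch {
  private readonly tasks: ILifecycleTask[] = [];

  registerTask(task: ILifecycleTask): void {
    this.tasks.push(task);
  }

  async end(removeNodes: () => void, detached: () => void): Promise<void> {
    // Block until every pending task (animations, async compose, ...) has settled.
    await Promise.all(this.tasks.filter(t => !t.done).map(t => t.wait()));
    removeNodes();
    detached();
  }
}

// Usage sketch: an animation plugin keeps the nodes in the DOM until the
// leave animation finishes; the core runtime knows nothing about animation.
function animateLeave(element: Element, lifecycle: DetachLifecycleSketch): void {
  const finished = element.animate([{ opacity: 1 }, { opacity: 0 }], 300).finished;
  let done = false;
  finished.then(() => { done = true; });
  lifecycle.registerTask({
    get done() { return done; },
    wait: () => finished,
  });
}
```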

Tests broken for now. Fixups coming soon.

@AureliaEffect (Member)

This pull request introduces 4 alerts and fixes 1 when merging 14bbed6 into 57c07e8 - view on LGTM.com

new alerts:

  • 4 for Unused variable, import, function or class

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@AureliaEffect (Member)

This pull request introduces 3 alerts and fixes 1 when merging 3e1c98e into 57c07e8 - view on LGTM.com

new alerts:

  • 3 for Unused variable, import, function or class

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@AureliaEffect (Member)

This pull request introduces 4 alerts and fixes 1 when merging d540ea0 into 57c07e8 - view on LGTM.com

new alerts:

  • 4 for Unused variable, import, function or class

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@AureliaEffect (Member)

This pull request introduces 6 alerts and fixes 1 when merging b2fef29 into 57c07e8 - view on LGTM.com

new alerts:

  • 6 for Unused variable, import, function or class

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@AureliaEffect (Member)

This pull request introduces 4 alerts and fixes 1 when merging f120c74 into 57c07e8 - view on LGTM.com

new alerts:

  • 4 for Unused variable, import, function or class

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@EisenbergEffect changed the title from "WIP feat(lifecycle): minimal add/remove of nodes" to "feat(lifecycle): new callbacks, async tasks, and updated template controllers" on Sep 27, 2018
@AureliaEffect (Member)

This pull request introduces 1 alert and fixes 1 when merging fb8fbd2 into 9eb0764 - view on LGTM.com

new alerts:

  • 1 for Unused variable, import, function or class

fixed alerts:

  • 1 for Unused variable, import, function or class

Comment posted by LGTM.com

@EisenbergEffect merged commit 7a9a635 into master on Sep 27, 2018
@EisenbergEffect deleted the lifecycle-improvements branch on September 27, 2018 05:39
gitbook-com bot pushed a commit that referenced this pull request Sep 1, 2022