
Lazy evaluation / hybrid push-pull execution model #323

Open
dphfox wants to merge 137 commits into base: main

Conversation

dphfox
Owner

@dphfox dphfox commented Apr 15, 2024

Closes #144. Implements Reactively's hybrid push-pull algorithm for Fusion's graph objects, as described on Technical Fluff and by Reactively's author.
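For readers unfamiliar with the technique: a hybrid push-pull graph pushes *invalidity* eagerly on writes but pulls *values* lazily on reads. Below is a minimal, illustrative Python sketch of that idea (Fusion itself is Luau, and all names here are invented; this is not Fusion's code):

```python
# Minimal hybrid push-pull reactive graph sketch.
# Writes push invalidity down to dependents; reads pull values lazily,
# recomputing only nodes that were marked invalid.
class Node:
    def __init__(self, compute=None, value=None):
        self.compute = compute      # None for source values
        self.value = value
        self.valid = True
        self.dependents = set()
        self.dependencies = set()

    def set(self, value):
        # Push phase: mark all transitive dependents invalid,
        # but do no recomputation yet.
        self.value = value
        self._invalidate_dependents()

    def _invalidate_dependents(self):
        for dep in self.dependents:
            if dep.valid:
                dep.valid = False
                dep._invalidate_dependents()

    def get(self):
        # Pull phase: recompute lazily, and only if invalid.
        if not self.valid:
            for d in self.dependencies:
                d.get()
            self.value = self.compute()
            self.valid = True
        return self.value

def derive(compute, *deps):
    node = Node(compute)
    node.valid = False
    for d in deps:
        d.dependents.add(node)
        node.dependencies.add(d)
    return node

a = Node(value=1)
b = derive(lambda: a.get() * 2, a)
a.set(10)        # push: b is marked invalid, nothing recomputes yet
print(b.get())   # pull: recomputes lazily here → 20
```

The payoff is that a burst of writes costs only cheap invalidity flags; expensive recomputation happens at most once, at the next read.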

@dphfox
Owner Author

dphfox commented Apr 15, 2024

Closes #303.

@dphfox dphfox linked an issue Apr 15, 2024 that may be closed by this pull request
@dphfox dphfox self-assigned this Apr 15, 2024
@dphfox dphfox mentioned this pull request Apr 16, 2024
@dphfox
Owner Author

dphfox commented Apr 16, 2024

Diverged a bit from the algorithm as written. Now, it only paints validity/invalidity, and stores a change time so that non-meaningful changes can skip recomputation.
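To make the divergence concrete, here is an illustrative Python sketch of validity painting plus change times (Fusion is Luau; all names here are invented, not Fusion's actual code). Invalidity is painted eagerly, but a node only advances its change time when its value meaningfully changes, so dependents can skip recomputation entirely:

```python
# Validity painting + change times, sketched.
clock = 0

class Node:
    def __init__(self, compute=None, value=None):
        self.compute = compute       # None for source values
        self.value = value
        self.valid = True
        self.changed_at = 0          # last *meaningful* change
        self.computed_at = 0         # last time compute ran
        self.dependents = set()
        self.dependencies = set()

    def set(self, value):
        global clock
        clock += 1
        if value != self.value:      # only meaningful changes advance time
            self.value = value
            self.changed_at = clock
        self._paint_invalid()

    def _paint_invalid(self):
        for dep in self.dependents:
            if dep.valid:
                dep.valid = False
                dep._paint_invalid()

    def get(self):
        global clock
        if not self.valid:
            for d in self.dependencies:
                d.get()
            # Recompute only if a dependency meaningfully changed since
            # this node last computed.
            if any(d.changed_at > self.computed_at for d in self.dependencies):
                clock += 1
                new_value = self.compute()
                self.computed_at = clock
                if new_value != self.value:
                    self.value = new_value
                    self.changed_at = clock
            self.valid = True
        return self.value

def derive(compute, *deps):
    node = Node(compute)
    node.valid = False
    node.computed_at = -1            # force the first computation
    for d in deps:
        d.dependents.add(node)
        node.dependencies.add(d)
    return node
```

With this, setting a value to itself still paints dependents invalid, but their next read notices no change time advanced and returns the cached value without recomputing.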

@dphfox
Owner Author

dphfox commented Apr 16, 2024

New algorithm documented on Technical Fluff: https://fluff.blog/2024/04/16/monotonic-painting.html

@dphfox dphfox linked an issue May 18, 2024 that may be closed by this pull request
@dphfox dphfox marked this pull request as ready for review May 18, 2024 20:49
@dphfox dphfox linked an issue May 26, 2024 that may be closed by this pull request
@dphfox
Owner Author

dphfox commented Jun 15, 2024

So I've been playing around with the bugs in this branch for a short while this evening. As best as I can tell, this branch messes up the lifetimes of something somehow - I suspect something that promises to create a new inner scope for a callback is instead returning an outer scope somewhere. Just a hunch.

@dphfox
Owner Author

dphfox commented Jun 16, 2024

Fixed some of the major lifetime issues - after a lot of diagnosis, it turns out the issue was that New calls were being cached, which is incorrect because scope upvalues were being used in the cached functions.
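The bug class is easy to reproduce in any language with closures. Here is a hypothetical Python illustration (Fusion's real New is Luau and these names are invented): caching a constructed function is unsafe when it closes over a scope upvalue, because later calls silently reuse the first caller's scope.

```python
# Why caching constructors that capture a scope upvalue is incorrect.
cache = {}

def cached_constructor(key, scope):
    if key not in cache:
        def construct():
            # Captures `scope` as an upvalue at cache-fill time.
            scope.append("cleanup task")
        cache[key] = construct
    return cache[key]

scope_a, scope_b = [], []
cached_constructor("Frame", scope_a)()
cached_constructor("Frame", scope_b)()   # BUG: still appends to scope_a
```

Here scope_b stays empty while scope_a collects both cleanup tasks, the mismatched-lifetime symptom described above; the fix was to stop caching the calls.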

@dphfox
Owner Author

dphfox commented Jun 17, 2024


Ported the Word Game example just fine... so looks like I'll need a more complex example to figure out what's going wrong.

@dphfox
Owner Author

dphfox commented Jun 21, 2024

If you create a tween with a repeat count of -1, then the first time it's evaluated, self._activeFrom is nil, so it sets self._activeElapsed to self._activeDuration (which is infinity), which then causes getTweenRatio to return NaN and emit a warning.

Will need to fix this
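The arithmetic behind the warning, sketched in Python with stand-in names mirroring the comment above (this is not Fusion's code): a repeat count of -1 makes the active duration infinite, and on the first evaluation the elapsed time gets set to that same infinity, so the ratio becomes inf / inf, which is NaN.

```python
import math

active_duration = math.inf           # repeat count of -1 → runs forever
active_elapsed = active_duration     # first evaluation: no start time yet
assert math.isnan(active_elapsed / active_duration)   # inf / inf → nan

# One possible guard (an assumption, not necessarily the fix taken):
# treat an infinite duration as "never completes".
def tween_ratio(elapsed, duration):
    if math.isinf(duration):
        return 0.0
    return elapsed / duration
```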

@dphfox
Owner Author

dphfox commented Jun 21, 2024

Will also need to fix the infinite loop check in change to allow for cyclic changes if they happen through a dependency that doesn't meaningfully change (likely, this just means allowing cycles in change).
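A sketch of that rule in Python (invented code, not Fusion's): during change propagation, a revisited node is only an infinite loop if it is about to change meaningfully *again*; a cycle that passes through a non-meaningful change terminates naturally, so it can be allowed.

```python
# Cycle-tolerant change propagation: cycles are permitted as long as
# they pass through a node whose value doesn't meaningfully change.
class GraphNode:
    def __init__(self, compute):
        self.compute = compute
        self.value = None
        self.dependents = []

def propagate(node, stack=()):
    new_value = node.compute()
    if new_value == node.value:
        return                        # non-meaningful change: cycle breaks here
    if node in stack:
        raise RuntimeError("infinite loop of meaningful changes")
    node.value = new_value
    for dep in node.dependents:
        propagate(dep, stack + (node,))
```

The key design choice is checking for a meaningful change *before* checking the cycle stack; only cycles whose every pass keeps producing new values are flagged as infinite loops.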

@dphfox
Owner Author

dphfox commented Jul 14, 2024

Something I realise now about the way that execution should work: the oldest graph objects should be evaluated first.

Here's why: suppose that there's an Observer A which creates or destroys another Observer B watching the same state. This is a simple dynamic graph.

First observation: every time an evaluation runs, we no longer know what the graph will look like. The current code does not account for this, so scope == nil checks should be added.

Second observation: intuitively, if Observer A decides to destroy B, then B should not be given the chance to evaluate. This isn't guaranteed either, because change does not specify the evaluation order of eager objects. So Fusion should track creation time and sort by it before executing eager objects.

https://fluff.blog/2024/07/14/glitches-in-dynamic-reactive-graphs.html
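The two observations above can be sketched together in Python (structure invented for illustration; Fusion is Luau): eager objects run oldest-first by creation time, and each one is re-checked for destruction (the scope == nil check) right before it runs, because an earlier observer may have destroyed it mid-run.

```python
import itertools

_creation_clock = itertools.count()

class Observer:
    def __init__(self, callback):
        self.created_at = next(_creation_clock)
        self.scope = object()        # becomes None once destroyed
        self.callback = callback

    def destroy(self):
        self.scope = None

def run_eager(observers):
    # First fix: deterministic oldest-first order.
    for obs in sorted(observers, key=lambda o: o.created_at):
        # Second fix: the graph may have changed mid-run,
        # so re-check destruction just before running.
        if obs.scope is not None:
            obs.callback()
```

With this ordering, an older Observer A that destroys a younger Observer B is guaranteed to run first, and B is then skipped rather than glitching.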
