What's wrong with stream or how to use it properly? #86
Hello, can you please help by providing a minimal repro? This includes:
Thanks for the report!
You can clone the code I linked above and test it yourself.
At first glance, you're not providing a local source of truth (disk cache) and you are setting your memory cache to expire. In this case the fetcher will be called when the cache is evicted. Can you confirm by one of the following:
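The expiration behavior described above can be sketched with a toy cache (this is an illustrative stand-in, not the actual Store API): with no disk-backed source of truth, once the in-memory entry expires the only place left to read from is the fetcher.

```kotlin
// Toy illustration (NOT the real Store API): an expiring memory cache
// with no disk source of truth must re-invoke the fetcher after eviction.
class TinyStore<K, V>(
    private val ttlMillis: Long,
    private val fetcher: (K) -> V
) {
    private data class Entry<V>(val value: V, val at: Long)
    private val memory = mutableMapOf<K, Entry<V>>()
    var fetchCount = 0
        private set

    fun get(key: K, now: Long): V {
        val cached = memory[key]
        if (cached != null && now - cached.at < ttlMillis) return cached.value
        fetchCount++ // cache miss or expired entry: hit the fetcher again
        val fresh = fetcher(key)
        memory[key] = Entry(fresh, now)
        return fresh
    }
}

fun main() {
    val store = TinyStore<String, String>(ttlMillis = 100) { "value-for-$it" }
    store.get("a", now = 0)   // miss -> fetch
    store.get("a", now = 50)  // still fresh -> served from memory
    store.get("a", now = 200) // expired -> fetch again
    println(store.fetchCount) // prints 2
}
```

With a persistent source of truth in the middle, the second fetch would instead be served from disk; without one, a fetch after every eviction is the expected behavior.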
Is the local source of truth a database, or what? Removing the cache expiration still triggers the fetcher several times with the same state, origin, and key. I even tried a very simple configuration:
Still a similar result. Please try the example I submitted here earlier. Updated:
Are you using alpha01? We had a bug that always triggered the fetcher when there is no source of truth; we fixed it in alpha02.
For a file-based persister, you can check out
@yigit You know, even if I use `StoreRequest.fresh` the behaviour is still the same. Then, if a persister is so essential and mandatory, what is the point of using Store if I only need a temporary memory/file cache, for example? At the very least, the library should not allow building a store without a persister and should throw an error at compile time (or at runtime).
The correct coordinate is
Hi @atonamy, in order to allow us to investigate, can you please provide a minimal repro? The current repro you provided includes
I've forked your app and added more logging. If you look at these logs, you can see that it always gets a new block id, so it is a new key for a new request. I'm not sure if I'm looking at the right part of the code; as @eyalgu mentioned, it would be much clearer if this were a sample app, but from what I can see in this app, it keeps sending different keys all the time, hence getting new values. Here is a sample output from a run:
Let me know if I'm looking at the wrong thing.
Yes, this #89 crashing issue also exists, but if you look at the log carefully you may notice these different keys. Sorry, I'm a bit busy, but I can create a more testable solution, maybe on the weekend, if it is still valuable.
Not sure if it's relevant, but in your original example I noticed that you launch one coroutine and, inside it, iterate over all your keys and collect the streams on them. A store stream never finishes collecting, so in that example the code will only start to collect the first key and will never get to the second. That may be specific to your example, so a full repro would still be the best next step here. Thanks again for trying out Store and taking the time to report bugs!
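The pitfall described above can be sketched with plain Kotlin sequences as a stand-in for the never-completing stream (`endlessStream` below is a hypothetical analogue, not a Store function): iterating keys and fully consuming an endless source inline means the loop never reaches the second key.

```kotlin
// A stand-in for a stream that never completes: an infinite Sequence.
fun endlessStream(key: String): Sequence<String> =
    generateSequence(0) { it + 1 }.map { "$key #$it" }

fun main() {
    val keys = listOf("a", "b")

    // BROKEN pattern (do not run): consuming an endless stream inline
    // never returns, so key "b" would never be reached.
    // keys.forEach { key -> endlessStream(key).forEach { println(it) } }

    // Taking a bounded prefix shows that each key only produces values
    // if its collection can actually be reached and bounded:
    val collected = keys.map { key -> endlessStream(key).take(2).toList() }
    println(collected) // prints [[a #0, a #1], [b #0, b #1]]
}
```

With real coroutine Flows, the usual fix is to launch a separate coroutine per key (or merge the flows) instead of collecting them one after another in a single coroutine.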
@eyalgu But stream collection always happens with new unique keys in this example. So even if a previous stream has not finished, how can it affect a new stream that is unrelated to the previous one? And you can even wait until the previous stream collection finishes; you will still get these duplicate calls when collecting a new stream with new, different keys anyway. You can even try to cancel the collecting coroutine after completion, and a new stream will still produce a duplicate collection with the same origin and state. Update: Need more tests to confirm.
In my log, it always gets called with a new key, so Store makes one request for each new key. Not sure if I understand the problem correctly; that code seems to work as expected. If there are different keys, there will be different requests. I feel like I'm still not understanding what the problem is :/ Every time you call the
When we collect a stream with a cache and a new key, it first goes to the fetcher, since we don't have a cached value yet for the new key, so the first response we receive comes from the fetcher. Now, if we start to collect multiple streams in parallel, the procedure described above is the same for each stream with a new unique key. But if we split the collection of multiple streams into several batches with an equal number of parallel streams, run sequentially (when one batch of stream collections completes, the next one starts), we can see that the behavior described above changes: the successful status arrives from the fetcher more than once. Have I described the problem clearly? In all streams and batches the keys are always new and unique; the same key is never reused anywhere in any stream in this scenario.
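The expectation in the first paragraph above can be modeled with a toy (these types are illustrative only, not the real Store `ResponseOrigin`): for a brand-new key the first response comes from the fetcher, and only repeated reads of the same key come from the cache.

```kotlin
// Toy model (NOT the real Store types): track where each response came from.
enum class Origin { Cache, Fetcher }

class OriginTrackingStore<K, V>(private val fetcher: (K) -> V) {
    private val memory = mutableMapOf<K, V>()

    // New key -> fetch and cache; known key -> serve from memory.
    fun get(key: K): Pair<Origin, V> =
        memory[key]?.let { Origin.Cache to it }
            ?: (Origin.Fetcher to fetcher(key).also { memory[key] = it })
}

fun main() {
    val store = OriginTrackingStore<Int, String> { "v$it" }
    println(store.get(1).first) // prints Fetcher (new key)
    println(store.get(1).first) // prints Cache (repeat of same key)
    println(store.get(2).first) // prints Fetcher (another new key)
}
```

Under this model, batching unique keys should make no difference: every never-seen key yields exactly one fetcher-origin response, which is why a batch-dependent duplicate would point at a bug rather than expected behavior.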
You had me until the last paragraph. To be honest, the simplest way to get a fix is to submit a failing test case. That will both allow us to reproduce your issue and verify a fix. It seems like you might be hitting a use case we did not consider.
@digitalbuddha No problem, let's wait until the end of the week. I'll try my best to write unit tests over the weekend with isolated and mocked infrastructure, if it is still an issue. Thank you.
Closing; please reopen when you have a verifiable test case.
Let's review this simple example.

The expected outcome should be that `CHECKING TRIGGER` will trigger once for each unique key, even if `fetchFromCollection` is executed several times (each time with always-unique keys). If I execute `fetchFromCollection` with one unique set of keys, it runs as expected, but if I execute `fetchFromCollection` a second time with another, different set of keys, it will trigger the `CHECKING TRIGGER` condition more than once (with the new set of unique keys). What am I missing? Why doesn't it work as expected?

And then, if I execute `fetchFromCollection` three, four times and so on, each time with a new set of keys, the stream will just hang in the Loading state forever. I pushed this example project demonstrating the issue in full scale.
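The expected contract described above can be sketched as a toy counter (`TriggerCounter` and its names are hypothetical, not from the project): the trigger should fire exactly once per unique key, no matter how many batches are executed.

```kotlin
// Hypothetical sketch of the EXPECTED behavior: one trigger per unique key.
class TriggerCounter {
    private val seen = mutableSetOf<String>()
    var triggers = 0
        private set

    fun onResponse(key: String) {
        if (seen.add(key)) triggers++ // fire only for a never-seen key
    }
}

fun main() {
    val counter = TriggerCounter()
    val batch1 = listOf("k1", "k2")
    val batch2 = listOf("k3", "k4") // all keys unique across batches

    (batch1 + batch2).forEach { counter.onResponse(it) }
    // Re-delivering responses for the same keys must not add triggers:
    (batch1 + batch2).forEach { counter.onResponse(it) }

    println(counter.triggers) // prints 4
}
```

The bug report is that the real stream violates this contract: later batches with fresh keys trigger more than once per key, and eventually hang in Loading.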