
Q: Canceling of open promise commands #122

Closed
forki opened this issue Oct 30, 2017 · 6 comments

Comments

@forki
Contributor

forki commented Oct 30, 2017

I have a series of messages coming into my system. Whenever a message arrives, my update function triggers Cmd.ofPromise to load and show some data. This is an expensive operation, and sometimes events arrive faster than it completes. Is there a way to cancel older promises from my Elmish system?

/cc @Zaid-Ajaj @et1975 @alfonsogarciacaro @MangelMaxime

@et1975
Member

et1975 commented Oct 30, 2017

There are two things you could do:

Batching

Load into a buffer and process the whole batch on a threshold/timer, instead of processing individual messages.

Splitting loading from rendering

You mention "show", so I'm assuming you have a non-React component that you feed the data to. You should be able to do what Elmish already does: update the model on every incoming message, but render only when you get a rendering opportunity.
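The batching idea above could look something like this (a minimal sketch, not from this thread: `Item`, `loadAndShow`, and the timer-driven `Flush` message are all assumed names; the timer itself would come from a subscription):

```fsharp
// Hypothetical sketch of batching: accumulate incoming items cheaply,
// and do the expensive load/show work once per batch when Flush arrives.
type Model =
    { Buffer : Item list }

type Msg =
    | ItemReceived of Item
    | Flush

let update msg model =
    match msg with
    | ItemReceived item ->
        // Cheap path: just buffer the item, no expensive work here
        { model with Buffer = item :: model.Buffer }, Cmd.none
    | Flush when List.isEmpty model.Buffer ->
        model, Cmd.none
    | Flush ->
        // Expensive path: process the whole batch at once (oldest first)
        { model with Buffer = [] }, loadAndShow (List.rev model.Buffer)
```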

@Zaid-Ajaj
Member

We could try integrating Axios with Elmish instead of just Fable.PowerPack, since Axios supports cancelling requests with CancelTokens

@forki
Contributor Author

forki commented Oct 30, 2017 via email

@Zaid-Ajaj
Member

If I understand correctly, you want one of two things:

  • Throttling: do not initiate a new request if an older request is still in progress or was initiated within some given interval.
  • Cancelling: when a new request comes in, cancel the one in progress and keep the new request.

You can achieve the first one by keeping a timestamp of when you initiated your last request; based on that, you can decide whether or not to initiate a new one.

Maybe something like this:

type Model = {
    // your stuff here
    LastRequestStartTime : DateTime
    LastRequestFinished : bool
}

type Msg =
  | StartLoadingData
  | ActuallyMakeRequest
  | RequestSuccess
  | RequestFailed

Your subscription should produce StartLoadingData and not ActuallyMakeRequest. Inside the update function you can decide whether or not to ActuallyMakeRequest:

let update msg state =
   match msg with
   | StartLoadingData ->
      // if the last request happened less than 30 seconds ago, then ignore
      // (TotalSeconds, not Seconds: Seconds is only the 0-59 component)
      if (DateTime.Now - state.LastRequestStartTime).TotalSeconds < 30.0
      then state, Cmd.none
      elif not state.LastRequestFinished
      then state, Cmd.none
      else { state with LastRequestStartTime = DateTime.Now
                        LastRequestFinished = false }, Cmd.ofMsg ActuallyMakeRequest
   | ActuallyMakeRequest ->
      // Cmd.ofPromise takes the promise-returning function, its argument,
      // and the success/error mappings (yourPromiseHere/yourArgsHere are placeholders)
      state, Cmd.ofPromise yourPromiseHere yourArgsHere
                 (fun _ -> RequestSuccess) (fun _ -> RequestFailed)
   | RequestSuccess ->
       // do stuff
       { state with LastRequestFinished = true }, Cmd.none
   | RequestFailed ->
       // do stuff
       { state with LastRequestFinished = true }, Cmd.none

Here I used both a boolean value and a timestamp to decide whether or not to start a new request; you can use either. Hope this helps!

@alfonsogarciacaro
Contributor

If you really need cancellation of an asynchronous operation, you can use Async (which supports cancellation in Fable) instead of Promise and then start it with Async.StartAsPromise.
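A minimal sketch of that approach (an assumption-laden illustration, not from this thread: `loadData`, the dispatch wiring, and the module-level mutable are all made up for the example): keep a `CancellationTokenSource` for the in-flight operation, cancel it when a new load starts, and start the replacement with `Async.StartAsPromise`.

```fsharp
open System.Threading

// Hypothetical sketch: cancel the previous in-flight load when a new one starts.
let mutable private inFlight : CancellationTokenSource option = None

let startCancellableLoad (loadData : unit -> Async<'Data>) (dispatch : 'Data -> unit) =
    // Cancel whatever is still running from the previous message
    inFlight |> Option.iter (fun cts -> cts.Cancel())
    let cts = new CancellationTokenSource()
    inFlight <- Some cts
    let work = async {
        let! data = loadData ()
        dispatch data
    }
    // In Fable, Async supports cancellation; StartAsPromise bridges it back
    // to a promise, honouring the token
    Async.StartAsPromise(work, cts.Token) |> ignore
```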

If you actually want throttling (or debouncing; there's some confusion between the terms), maybe something similar to this Elmish debugger helper can be useful. I think in the end the idea is basically the same as what @Zaid-Ajaj proposed above :)

@et1975
Member

et1975 commented Nov 10, 2017

Closing: this is a task-level optimization, see #123 for the discussion about the role of Cmd in this.
