
Add batching #295

Merged
merged 18 commits into from Apr 14, 2017

Conversation

nhunzaker
Contributor

This commit exposes control over the release process so that changes can be batched together. It also ensures that batching does not break `history.wait()`.

Additionally, it updates the canvas example to use a `requestIdleCallback` batching strategy. This improves write throughput by about 60-80%, but I'm mostly looking forward to preventing duplicate renders for loading states that finish really quickly.

http://probable-pocket.surge.sh/

@codecov-io

codecov-io commented Apr 12, 2017

Codecov Report

Merging #295 into master will not change coverage.
The diff coverage is 100%.


@@          Coverage Diff          @@
##           master   #295   +/-   ##
=====================================
  Coverage     100%   100%           
=====================================
  Files          22     23    +1     
  Lines         802    820   +18     
=====================================
+ Hits          802    820   +18
Impacted Files Coverage Δ
src/history.js 100% <100%> (ø) ⬆️
src/default-update-strategy.js 100% <100%> (ø)
src/microcosm.js 100% <100%> (ø) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 19ee6ec...a10c30b. Read the comment docs.

}, options)
}
}
}
Contributor Author

@nhunzaker nhunzaker Apr 13, 2017

It would be cool for this to be a microcosm-batch-updates module, or something. I imagine there's really only one way to do this.

@nhunzaker nhunzaker changed the base branch from initial-state-update to master April 13, 2017 11:56
Microcosm is [Flux](https://facebook.github.io/flux/) with first-class
actions and state sandboxing.
The Microcosm class provides a centralized place to store application
state, dispatch actions, and track changes.
Member

👍

Contributor

I am wondering if we should keep the reference to Flux for the "This is like X" aspect of documentation, otherwise this is an awesome change. 👍

you provide to Microcosm are passed into the `setup` lifecycle method:

```javascript
import Autosave from './effects/autosave'
```
Member

Can this example code demonstrate the concept without introducing effects? That would keep the section as simple as possible.

Contributor Author

Hmm. Good idea.

@@ -0,0 +1,132 @@
# Batch Updates
Member

this doc is really good, 👏

/**
* The central tree data structure that is used to calculate state for
* a Microcosm. Each node in the tree is an action. Branches are
* changes over time.
* @constructor
*/
-export default function History (limit) {
+export default function History (config) {
Member

with this change, would it make sense to warn when config is not an object?

Contributor Author

I don't think so. `History` is never instantiated directly; it always passes through `new Microcosm()`.
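A minimal sketch of that flow, under the assumption that Microcosm simply forwards its options object (only the `History (config)` signature comes from this diff; the constructor bodies below are hypothetical):

```javascript
// Hypothetical sketch: Microcosm constructs History itself and forwards a
// normalized options object, so History can assume config is always an object.
function History (config) {
  this.limit = config.limit || 0
}

function Microcosm (options) {
  // Microcosm normalizes options before History ever sees them
  this.history = new History(options || {})
}

const repo = new Microcosm({ limit: 10 })
console.log(repo.history.limit) // 10
```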

Member

Oh, I see. Sounds good. Somehow I thought I was looking at the `Microcosm` constructor.

Member

@cwmanning cwmanning left a comment

👍

@nhunzaker
Contributor Author

nhunzaker commented Apr 13, 2017

@cwmanning good call on the test. I discovered a bug where you could get stuck in `wait()` if you waited during a release.

Which made me realize... we don't need to request any more "updates" if the last update hasn't fired yet. That can reduce the complexity of the updater significantly.

Instead of doing:

// A lot of browsers don't support requestIdleCallback, so we patch it
import 'ric'

// Never let the user wait more than 24 milliseconds for an update
const options = { timeout: 24 }

export default function requestIdleBatch () {

  // Batching strategies return a function. This allows you to
  // maintain state within the closure above. Here, we keep track of
  // the last frame of work
  let frame = null

  return update => {
    if (frame == null) {
      frame = requestIdleCallback(() => {
        frame = null
        update()
      }, options)
    }
  }
}

We could just do:

// A lot of browsers don't support requestIdleCallback, so we patch it
import 'ric'

// Never let the user wait more than 24 milliseconds for an update
const options = { timeout: 24 }

export default function requestIdleBatch () {
  return update => requestIdleCallback(update, options)
}

I still want to return a function, because it would allow you to do things like setup logging or some other system. But you don't have to keep track of prior frames, which is really nice.
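The logging idea could look something like this sketch (my illustration; `loggingBatch` is a hypothetical name, not code from this PR):

```javascript
// Sketch: because the strategy returns a function, the enclosing scope can
// hold setup state. Here the closure counts how many updates it has released.
function loggingBatch () {
  let count = 0 // set up once, shared by every update the closure releases

  return update => {
    count += 1
    console.log('releasing update #' + count)
    update()
  }
}

// Built once; the returned function is what would be called for each release.
const release = loggingBatch()
release(() => {}) // logs "releasing update #1"
release(() => {}) // logs "releasing update #2"
```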

What do you think?

@nhunzaker
Contributor Author

@cwmanning technically now, you could also just do:

export default function requestIdleBatch () {
  return update => setTimeout(update, 24)
}
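If you still want coalescing without the polyfill, here is a sketch (my variant, not code from this PR) that applies the same frame-tracking idea to `setTimeout`. While a timer is pending, additional requests are ignored, which is fine when `update` always renders the latest state:

```javascript
// Sketch: coalesce bursts of updates into a single setTimeout callback,
// mirroring the frame-tracking requestIdleCallback strategy above.
function setTimeoutBatch () {
  let timer = null

  return update => {
    if (timer == null) {
      timer = setTimeout(() => {
        timer = null
        update()
      }, 24)
    }
  }
}
```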

@cwmanning
Member

@nhunzaker I like the simplified updater example. You might include the really simple one (`setTimeout`) in the example code as well, for those who may opt to avoid a polyfill.

@nhunzaker
Contributor Author

@cwmanning Done, what do you think?

@nhunzaker nhunzaker self-assigned this Apr 13, 2017
@nhunzaker nhunzaker requested a review from leobauza April 13, 2017 14:07
@nhunzaker
Contributor Author

@leobauza Do these documentation updates feel good? Anything we should change?

Member

@cwmanning cwmanning left a comment

👍

@efatsi
Contributor

efatsi commented Apr 13, 2017

Neat stuff. I'm curious what people's thoughts are around simplifying the interface for this and just exposing a configurable batch delay option that would instantiate an in-house updater. Reading through the Batch Updates docs is neat, but by the end I got the sense that there's really one optimal way to do things (use `requestIdleCallback`). Is it worth fleshing this out a little more and replacing the idea of defining your own updater with just defining a batch interval? Or are there other benefits to defining your own updater that I'm not picking up on?

test("it increases the number when the stepper is clicked", function () {
let app = mount(<App />)

app.find('#stepper).simulate('click')
Contributor

Should be `app.find('#stepper')` (missing closing quote mark after `#stepper`).

Contributor Author

Thank you!

@mackermedia
Contributor

👍 Those docs are 💯. Also, I think @efatsi is on to something with providing a simple mechanism to use the default recommended batch update operation.

@nhunzaker
Contributor Author

@mackermedia @efatsi Hmm, what do you think about making this option `batch: true`, and if it's a function, using that function instead?

@nhunzaker
Contributor Author

nhunzaker commented Apr 13, 2017

`batch: true`, with the caveat that you'd need to polyfill `requestIdleCallback`.

@mackermedia
Contributor

I was thinking `batch: true` would use the `requestIdleCallback` method and you wouldn't have to think about polyfilling (Microcosm happily does that for you).
Or maybe, if `batch: 12` is a number, it would use the `setTimeout` method? Is there any value to that method? If so, it might be nice for it to be a simple config option.
Or you could also provide your own function?
Overboard?

@efatsi
Contributor

efatsi commented Apr 13, 2017

Agreeing with @mackermedia: passing in functions allows for greater control but requires more underlying knowledge of Microcosm, mainly what's passed into that function. My thinking would be something like `batchInterval: 12`.

Contributor

@leobauza leobauza left a comment

👍

@nhunzaker
Contributor Author

@efatsi, @mackermedia Here's my thinking. I want a safe default, with an escape hatch. I think we can do this without exposing a lot of complexity.

I think we need to add 2-3 options:

batch: true
batchInterval: 24, // ? Maybe... is this useful?
updater: buildInUpdater

The updater function will get the options passed into Microcosm. Our default updater function would look like:

function requestIdleBatch (options) {
  if (options.batch) {
    return update => requestIdleCallback(update, { timeout: options.batchInterval })
  } else {
    return update => update()
  }
}

This allows you:

  1. To just set `batch: true`. End of story.
  2. To configure the batch interval (is it an interval, or a timeout?)
  3. To configure a custom updater.
  4. To conditionally apply that batch updater based on environment

I'm still debating whether or not we need batchInterval. I think a timeout of 24 would be exactly what I would do on any given project, and tweaking that timeout under the hood wouldn't be a breaking change (unless we go from like 24ms to 200ms).

What do you think?

@mackermedia
Contributor

Honestly, I don't really know the use case where I'd reach for it. It sounds like it's for when the app starts to get janky from a lot of actions resolving and re-rendering while I want an animation to finish.
I think `batch: true` and `updater: myUpdateFunction` are probably all that's necessary for now?
Keep it simple?

@nhunzaker
Contributor Author

@mackermedia Yep. That's precisely the use case. When you have a lot of actions firing, causing meaningful changes, and the successive change events cause expensive repaints.

Just having `batch` and `updater` sounds good to me.

@efatsi
Contributor

efatsi commented Apr 14, 2017

👍 to batch and updater, and then to this PR

@nhunzaker
Contributor Author

`batch` and `updater` it is.

@nhunzaker nhunzaker merged commit a10c30b into master Apr 14, 2017
@nhunzaker nhunzaker deleted the batching branch April 17, 2017 18:01