This repository has been archived by the owner on Sep 6, 2021. It is now read-only.

Add process in background options to doSequentially #1009

Merged
merged 2 commits into adobe:master on Jul 27, 2012

Conversation

jhatwich
Member

@jhatwich jhatwich commented Jun 7, 2012

doSequentially can now be used to process a list of tasks in time
slices so long-running processes don't block the UI.

I ran into the need for this in creating a related files extension.

@ghost assigned peterflynn Jun 7, 2012
@peterflynn
Member

Very interesting! A few questions/comments:

  • It feels like this makes doSequentially() serve two very distinct purposes: sequencing and aggregating asynchronous tasks (which happen mainly off the JS thread) plus running slices of synchronous processing (which happen entirely on the JS thread). I'm wondering if it would be cleaner to recast this as a new API that sits on top of doSequentially(), keeping doSequentially() simpler and more narrowly scoped.
  • Could you talk a bit about the use cases for sliceNow = true vs. false? In what situations would you want to delay processing for 30ms?
  • Should we expose TIME_SLICE and TIME_YIELD as optional arguments? It seems like the caller might want to finesse their values depending on use case (e.g. a long-running modal operation that only wants to unblock long enough to repaint a progress bar might prefer bigger slices, while an operation that runs continuously while the user is typing might want shorter slices or longer idle gaps).

@peterflynn
Member

Here's an example of building this API on top of doSequentially() instead of integrating it into doSequentially() itself. (I haven't tested this code at all yet, so caveat emptor.)

    function doInBackground(items, processItem, maxBlockingTime, idleTime) {
        // Argument defaults
        maxBlockingTime = maxBlockingTime || 20;
        idleTime = idleTime || 30;

        var sliceStartTime = (new Date()).getTime();

        return doSequentially(items, function (item, i) {
            var result = new $.Deferred();

            processItem(item, i);

            // If our time slice is done, pause before letting the next item in the sequence begin
            // processing. If we still have more time left, let the next item start immediately.
            if ((new Date()).getTime() - sliceStartTime >= maxBlockingTime) {
                window.setTimeout(function () {
                    sliceStartTime = (new Date()).getTime();
                    result.resolve();
                }, idleTime);
            } else {
                result.resolve();
            }

            return result;
        }, false);
    }

What do you think? Does this remain flexible enough for your (and similar) needs?
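For anyone reading along outside Brackets: here's a self-contained sketch of the same time-slicing pattern. It swaps in native Promises for jQuery Deferreds and uses a minimal stand-in for doSequentially, purely for illustration — the real doSequentially lives in Brackets' Async module and behaves differently in detail (e.g. failure handling).

```javascript
// Minimal stand-in for doSequentially: run each item's task after the
// previous one resolves, and resolve the returned promise when all are done.
// (Illustration only -- not Brackets' actual implementation.)
function doSequentially(items, beginProcessItem) {
    return items.reduce(function (prev, item, i) {
        return prev.then(function () { return beginProcessItem(item, i); });
    }, Promise.resolve());
}

function doInBackground(items, processItem, maxBlockingTime, idleTime) {
    maxBlockingTime = maxBlockingTime || 20;
    idleTime = idleTime || 30;
    var sliceStartTime = Date.now();

    return doSequentially(items, function (item, i) {
        processItem(item, i);

        // If our time slice is used up, pause before letting the next item
        // begin, so the event loop can repaint / handle input. Otherwise,
        // let the next item start immediately.
        if (Date.now() - sliceStartTime >= maxBlockingTime) {
            return new Promise(function (resolve) {
                setTimeout(function () {
                    sliceStartTime = Date.now();
                    resolve();
                }, idleTime);
            });
        }
        return Promise.resolve();
    });
}
```

A caller would use it like `doInBackground(files, indexFile).then(...)`, tuning maxBlockingTime and idleTime to taste.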

@jhatwich
Member Author

jhatwich commented Jun 9, 2012

Making it a separate API would work fine for my needs - the optional
parameters don't bother me personally, but I can see how they might
make the API more confusing to digest. I added it to doSequentially
because the background processing behavior is what I expected that function
to do in the first place :) Maybe "sequenceInBackground" or
"doSequentiallyInBackground"?

Exposing idleTime and maxBlockingTime is a good idea too.

The sliceNow option lets you try to get something done synchronously, but
defer it if it takes too long. I'm not using that option, but as a utility
it seemed like it could be useful in some situations. I'm using
sliceNow = false to make sure the operation I'm scheduling happens "in the
background" - for related files I didn't want to risk a performance hit
on load, since there is already a lot going on. It might be better as a
milliseconds value - how long to delay the first slice (defaulting to 0).
"delayStartTime" perhaps?
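Roughly something like this (a sketch only - the names delayStartTime and startFirstSlice are hypothetical, just to show the boolean replaced by a millisecond delay):

```javascript
// Hypothetical sketch: a numeric delayStartTime (ms) instead of the
// boolean sliceNow. delayStartTime = 0 behaves like sliceNow = true
// (the first slice starts synchronously); anything larger defers the
// first slice, like sliceNow = false but with a configurable delay.
function startFirstSlice(runFirstSlice, delayStartTime) {
    delayStartTime = delayStartTime || 0;
    if (delayStartTime === 0) {
        runFirstSlice();                           // start right away
    } else {
        setTimeout(runFirstSlice, delayStartTime); // defer the start
    }
}
```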

  • Josh


Restore doSequentially to process the list in a blocking fashion and
add doSequentiallyInBackground that does processing in configurable
time slices.
@jhatwich
Member Author

Sorry for the long delay. I've finally applied the refactoring to Async based on your code, and it seems to be working great. I left delayStartTime out for now since I'm not using it anyway.

  • Josh

@peterflynn
Member

@jhatwich: Sorry for my long delay as well :-) It looks good to me -- merging now.

peterflynn added a commit that referenced this pull request Jul 27, 2012
Add Async utility for background processing
@peterflynn peterflynn merged commit e763215 into adobe:master Jul 27, 2012