Improved performance dirty selector #84

Closed
wants to merge 2 commits

Conversation

@ziykon (Contributor) commented Sep 16, 2015

A rescan (ASP.NET UpdatePanel) would take 7-8 seconds to complete. For every field, the :dirty selector is run against the form element, and the :dirty selector in turn uses the :dirtyignored selector, which is quite heavy. This change makes the :dirty selector check for the dirty class first, and only check :dirtyignored on fields that are dirty.

Commits:

  1. Changed the :dirty selector to check for the dirty class first, then only check :dirtyignored on dirty fields (a rescan with an ASP.NET UpdatePanel was taking 7-8 seconds before rendering the content).
  2. Fixed a bug in the previous commit; need to check for the dirty class on non-ignored fields.
@ziykon (Contributor, Author) commented Sep 17, 2015

You are correct. I have now made the change.

@NightOwl888 (Collaborator) commented

Actually, after some experimentation it turns out I was wrong. jQuery calls the selector function one time for each element. So your original logic would have worked.
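
For reference, a custom pseudo-selector is registered roughly like this (a simplified sketch, not the plugin's actual code), and jQuery invokes the function once per candidate element:

$.expr[':'].dirty = function (element) {
    // jQuery calls this once for each element being tested
    return $(element).hasClass('dirty');
};

// so a call like this runs the function element-by-element:
$('form :dirty');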

However, I asked this question on Stack Overflow to see if I could get a better solution than this. Then, after contemplating it some more, I am reconsidering whether to combine the selectors at all.

The :dirty selector is only used in 2 different places in the source code: the isDirty method and the setDirtyStatus method. Since jQuery is very good at combining selectors on its own, there is little point in filtering them both within the same selector. That way they could be applied in the proper order, :dirty:not(:dirtyignored), and even better, optimized using $(':dirty').not(':dirtyignored') without any trouble. :dirty then becomes a complete list of everything with the dirty CSS class, without excluding any ignored fields.
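
To illustrate, the two usages inside isDirty/setDirtyStatus would then look something like this (just a sketch of the shape of the change, not the exact source):

// combined into a single selector string:
$('form').find(':dirty:not(:dirtyignored)');

// applying them one at a time, which jQuery optimizes much better:
$('form').find(':dirty').not(':dirtyignored');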

Thoughts?

@ziykon (Contributor, Author) commented Sep 17, 2015

The problem with that is if someone uses the :dirty selector outside of the dirtyforms plugin. Alternatively, you could internally use $('.' + dirtyForms.dirtyClass).not(':dirtyignored'), which might also be faster than using a custom selector (I'm no jQuery expert, but I'm guessing there is some optimization there)?
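
Roughly like this (just a sketch; I'm assuming the class name is exposed via the settings, e.g. $.DirtyForms.dirtyClass, which defaults to 'dirty'):

// select by the plain CSS class first, then exclude ignored fields
$('form').find('.' + $.DirtyForms.dirtyClass).not(':dirtyignored');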

@NightOwl888 (Collaborator) commented

The problem with that is if someone uses the :dirty selector outside of the dirtyforms plugin

True. But I don't consider it a "problem" so much as reverting back to the 1.x behavior. Since 2.x is still in beta, I don't consider this breaking change to be a huge issue, but I will make note of it in the Releases.

What I do see as a huge issue is the performance impact of trying to combine the selectors internally, since jQuery does a much better job by applying them one selector at a time. I made an incorrect assumption about the way selectors work, so it is best to fix the performance based on correct assumptions while in beta.

Alternatively, you could internally use $('.' + dirtyForms.dirtyClass).not(':dirtyignored'), which might also be faster than using a custom selector (I'm no jQuery expert, but I'm guessing there is some optimization there)?

Like I said, jQuery is optimized for combining the selectors itself, not from within the selector. I did some testing, and indeed the .not() method doubles performance over the :not() selector in this case.
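
The test was along these lines (a rough sketch using console.time, not the exact harness; actual numbers will depend on the page):

console.time('combined :not() selector');
for (var i = 0; i < 1000; i++) {
    $('form').find(':dirty:not(:dirtyignored)');
}
console.timeEnd('combined :not() selector');

console.time('chained .not() call');
for (var i = 0; i < 1000; i++) {
    $('form').find(':dirty').not(':dirtyignored');
}
console.timeEnd('chained .not() call');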

Use Case

Specifically, why are you calling rescan? To add forms/fields dynamically or to update the dirty status of the forms/fields?

The reason I ask is that there are really two operations going on there, but originally I didn't think it would be worth the extra bytes in the download to split them up into separate methods. But now I see they perform very differently, and perhaps a sync method should be made to sync up the status in a separate operation.

There are really only 2 reasons to call rescan:

  1. When you have added a form dynamically to the page, and want to include it in the Dirty Forms check.
  2. When you want to sync the dirty class with the actual dirty/clean status of the fields/descendant fields of the selector.

Since you said you are using ASP.NET, you can pretty much rule out the first reason, because only 1 form per page is supported. So that leaves the second option, which is only useful if you are styling the fields with CSS so you can visually see the dirty vs clean fields.

Dirty Forms will automatically add dynamically added fields to the check as soon as the user sets focus to them (one at a time), so there is no reason to call rescan for dynamically added fields (which would add them all at once).

Also, if you dynamically change any of the ignore selectors on the page, they are automatically taken into account during the isDirty check. So, the only time you have to worry about it is if you do something like put an ignoreClass on a parent container of a field that is dirty and want that field to immediately be styled via CSS as "clean".
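
For example, something like this (a sketch; #sidePanel is just a placeholder, and I'm assuming the ignore class is read from $.DirtyForms.ignoreClass):

// start ignoring everything inside this container...
$('#sidePanel').addClass($.DirtyForms.ignoreClass);

// ...then rescan so any dirty fields inside it are immediately restyled as "clean"
$('#sidePanel').dirtyForms('rescan');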

Do note you can drastically improve performance by being specific with the selector used when calling rescan:

$('#myDiv').dirtyForms('rescan');

is much faster than

$('form').dirtyForms('rescan');

You can also get better performance, if you are using any helpers, by excluding the check for them when they don't apply to your UpdatePanel:

$('#myDiv').dirtyForms('rescan', false, true);

Finally, if you are trying to update the dirty status of your fields after the round-trip to the server, you might be better off writing code on the server-side that injects the proper data-df-orig="current value" and class="dirty" values rather than calling rescan.

@ziykon (Contributor, Author) commented Sep 19, 2015

I've done some more research, and I've figured out why I needed the rescan.

We use Select2 for dropdowns, and the dropdowns loaded by the update panel do not receive the focus event for some reason, hence no original value is set on the element. Also, I have a helper for the SharePoint PeopleEditor control, which sets the original value during rescan. Both of these issues can probably be solved differently, but using a rescan seems like the easiest solution. :)

We are implementing this on a large existing application, so we prefer to keep the code as generic as possible, without any per-element code such as setting original values server-side or rescanning specific elements.

Because we use an UpdatePanel, which replaces the original elements, I've implemented the following solution:

var dirtyCollection = [];
var postbackElement;

// remember which element triggered the partial postback
Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(function (sender, args) {
    postbackElement = args.get_postBackElement();
});

// before the UpdatePanel replaces its content, save the dirty state of the affected fields
Sys.WebForms.PageRequestManager.getInstance().add_pageLoading(function (sender, args) {
    if (typeof postbackElement === 'undefined') {
        return;
    }

    if ($(postbackElement).attr('type') === 'submit') {
        return;
    }

    // clear the collection from any previous partial postback
    while (dirtyCollection.length) {
        dirtyCollection.pop();
    }

    $('form :dirty').each(function (idx, el) {
        var element = $(el);
        var item = { id: element.attr('id') };
        if (element.data('df-orig') !== undefined) {
            item.original = element.data('df-orig');
        }
        if (element.data('df-empty') !== undefined) {
            item.empty = element.data('df-empty');
        }

        dirtyCollection.push(item);
    });
});

// after the partial postback, restore the saved state onto the new elements and rescan
Sys.WebForms.PageRequestManager.getInstance().add_endRequest(function (sender, args) {
    if (args.get_response().get_aborted()) {
        return;
    }

    $(dirtyCollection).each(function (idx, item) {
        var element = $('#' + item.id);
        if (item.original !== undefined) {
            element.data('df-orig', item.original);
        }
        if (item.empty !== undefined) {
            element.data('df-empty', item.empty);
        }
    });

    $('form').dirtyForms('rescan');
});

It's probably not perfect, but it seems to solve it for us, except for the slow rescan, that is. :) With my original change to the :dirty selector, the performance is sufficient for our use, but I'm happy to contribute in any way I can.
