Improved performance dirty selector #84
Conversation
When doing a rescan (ASP.NET UpdatePanel), the application would take 7-8 seconds before rendering the content. For every field, the :dirty selector is applied to the form element, and the :dirty selector in turn uses the :dirtyignored selector, which is quite heavy. Changed the :dirty selector to check for the dirty class first, and only check :dirtyignored on fields that are already dirty.
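A minimal, framework-free sketch of the reordering described above. The field objects and helper names here are hypothetical stand-ins; in the plugin itself this logic lives inside the jQuery `:dirty` pseudo-selector:

```js
// Hypothetical model of the two filter orders. "isIgnored" stands in for
// the expensive :dirtyignored traversal; the dirty-class check is cheap.
function isIgnored(field) {
  return field.ignored === true; // placeholder for a costly ancestor scan
}

function hasDirtyClass(field) {
  return field.classes.indexOf('dirty') !== -1;
}

// Old order: the expensive ignore check runs for every field.
function dirtyFieldsOld(fields) {
  return fields.filter(function (f) {
    return !isIgnored(f) && hasDirtyClass(f);
  });
}

// New order: cheap class check first, so isIgnored only runs on the
// (usually few) fields that are actually dirty.
function dirtyFieldsNew(fields) {
  return fields.filter(function (f) {
    return hasDirtyClass(f) && !isIgnored(f);
  });
}

var fields = [
  { id: 'a', classes: ['dirty'], ignored: false },
  { id: 'b', classes: [], ignored: false },
  { id: 'c', classes: ['dirty'], ignored: true }
];

console.log(dirtyFieldsNew(fields).map(function (f) { return f.id; })); // [ 'a' ]
```

Both orders return the same fields; only the number of expensive ignore checks differs, which is where the speedup comes from.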
Bug in the previous commit; need to check for dirty class on non-ignored fields
You are correct. I have now made the change.
Actually, after some experimentation it turns out I was wrong. However, I asked this question on Stack Overflow to see if I could get a better solution than this. Then, after contemplating it some more, I am reconsidering whether to combine selectors at all. Thoughts?
The problem with that is if someone uses the …
True. But I don't consider it a "problem" so much as reverting to the 1.x behavior. Since 2.x is still in beta, I don't consider this breaking change a huge issue, but I will note it in the Releases. What I do see as a huge issue is the performance impact of trying to combine the selectors internally, since jQuery does a much better job applying them one selector at a time. I made an incorrect assumption about the way selectors work, so it is best to fix the performance based on correct assumptions while still in beta.
Like I said, jQuery is optimized for combining the selectors itself, not from within the selector. I did some testing and indeed that is the case.

Use Case

Specifically, why are you calling `rescan`? The reason I ask is that there are really two operations going on there, but originally I didn't think it would be worth the extra bytes in the download to split them up into separate methods. But now I see they perform very differently, and perhaps a split is warranted. There are really only two reasons to call `rescan`.

Since you said you are using ASP.NET, you can pretty much rule out the first reason, because only one form per page is supported. That leaves the second option, which is only useful if you are styling the fields with CSS so you can visually distinguish dirty from clean fields.
Do note you can drastically improve performance by being specific with the selector used when calling `rescan`:

```js
$('#myDiv').dirtyForms('rescan'); // much faster than $('form').dirtyForms('rescan');
```

In some cases you can also get better performance by calling:

```js
$('#myDiv').dirtyForms('rescan', false, true);
```

Finally, if you are trying to update the dirty status of your fields after the round-trip to the server, you might be better off writing code on the server side that injects the proper state into the markup.
I've done some more research, and I've figured out why I needed the rescan. We use Select2 for dropdowns, and the dropdowns loaded by the UpdatePanel do not receive the focus event for some reason, hence no original value is set on the element. I also have a helper for the SharePoint PeopleEditor control, which sets the original value during rescan. Both of these issues can probably be solved differently, but using a rescan seems like the easiest solution. :) We are implementing this on a large existing application, so we prefer keeping the code as generic as possible, without any per-element code such as setting original values server-side or rescanning specific elements. Because the UpdatePanel replaces the original elements, I've implemented the following solution:

```js
var dirtyCollection = [];
var postbackElement;

Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(function (sender, args) {
    postbackElement = args.get_postBackElement();
});

Sys.WebForms.PageRequestManager.getInstance().add_pageLoading(function (sender, args) {
    if (typeof postbackElement === 'undefined') {
        return;
    }
    if ($(postbackElement).attr('type') === 'submit') {
        return;
    }
    // Remember the state of every dirty field before the UpdatePanel
    // replaces the DOM (and jQuery's .data() storage with it).
    dirtyCollection.length = 0;
    $('form :dirty').each(function (idx, el) {
        var element = $(el);
        var item = { id: element.attr('id') };
        // jQuery has no .hasData() instance method, so check the
        // stored value directly instead.
        if (element.data('df-orig') !== undefined) {
            item.original = element.data('df-orig');
        }
        if (element.data('df-empty') !== undefined) {
            item.empty = element.data('df-empty');
        }
        dirtyCollection.push(item);
    });
});

Sys.WebForms.PageRequestManager.getInstance().add_endRequest(function (sender, args) {
    if (args.get_response().get_aborted()) {
        return;
    }
    // Re-attach the saved state to the freshly created elements.
    $(dirtyCollection).each(function (idx, item) {
        var element = $('#' + item.id);
        if (item.original !== undefined) {
            element.data('df-orig', item.original);
        }
        if (item.empty !== undefined) {
            element.data('df-empty', item.empty);
        }
    });
    $('form').dirtyForms('rescan');
});
```

It's probably not perfect, but it seems to solve it for us, except for the slow rescan, that is. :) With my original change to the :dirty selector, the performance is sufficient for our use, but I'm happy to contribute in any way I can.
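The stash-and-restore pattern above can be modeled without ASP.NET or jQuery at all. Everything in this sketch (the `store` object, `stashDirtyState`, `restoreDirtyState`) is hypothetical scaffolding standing in for jQuery's `.data()` storage and the UpdatePanel lifecycle:

```js
// Before the partial postback, record each dirty field's id plus its stored
// original value; afterwards, attach that data to the replacement nodes so
// the rescan sees the right state.
var store = {}; // stands in for jQuery's .data() storage, keyed by element id

function stashDirtyState(fields) {
  return fields
    .filter(function (f) { return f.dirty; })
    .map(function (f) { return { id: f.id, original: store[f.id] }; });
}

function restoreDirtyState(collection) {
  collection.forEach(function (item) {
    if (item.original !== undefined) {
      store[item.id] = item.original; // the "UpdatePanel" re-created this node
    }
  });
}

// Round trip: the "DOM replacement" wipes the store; the restore refills it.
store.name = 'Alice';
var saved = stashDirtyState([{ id: 'name', dirty: true }, { id: 'city', dirty: false }]);
store = {};               // UpdatePanel swaps the DOM; element data is lost
restoreDirtyState(saved); // re-attach the saved original values
console.log(store.name);  // Alice
```

Only dirty fields are stashed, which mirrors why the `:dirty` selector's cost dominates this flow: the stash step iterates the selector's results on every partial postback.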