
Implement a global system to control DOM size #1555

toddparker opened this Issue · 9 comments

6 participants


Memory management: how to keep the DOM from getting too big?

A new global configuration option to set the max number of pages to keep in the DOM at once. Once that max is hit, we run a document-wide $("selector").remove() at the appropriate juncture in every changePage invocation (post-Ajax success) that finds all pages that are data-cache="false" or oldest in the stack and deletes them. We’ll need to check for alreadyLocalPage = $("[data-url='...']:not([data-cache='false'])")

Default to a sensible number like 15-20 pages, but this can be overridden to any number greater than 2 (to allow for transitions) or turned off completely
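A rough sketch of what the flushing decision above could look like, written against plain page objects rather than the DOM so the selection logic is testable on its own. The function name and the page-object shape (url, cache, lastVisited) are illustrative assumptions, not the real jQuery Mobile API:

```javascript
// Sketch: decide which pages to flush once the configured max is exceeded.
// maxPages of 0 means the limit is turned off entirely.
function pagesToRemove(pages, maxPages, activeUrl) {
  if (!maxPages) { return []; }

  // Pages flagged data-cache="false" are always flushed (never the active page).
  var flagged = pages.filter(function (p) {
    return p.cache === false && p.url !== activeUrl;
  });

  // Of the cacheable pages, flush the oldest until we are back under the max.
  var keepable = pages.filter(function (p) { return p.cache !== false; });
  var excess = keepable.length - maxPages;
  var oldest = keepable
    .filter(function (p) { return p.url !== activeUrl; })
    .sort(function (a, b) { return a.lastVisited - b.lastVisited; })
    .slice(0, Math.max(excess, 0));

  return flagged.concat(oldest);
}
```

In the real implementation the returned set would be mapped back to DOM nodes via their data-url attributes and removed with .remove().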


After discussing this, we might want to have a simpler mechanism for the nearer term until we really understand how memory is freed up on these devices:

When you hit the max number of pages in the DOM (if specified), the next link/form submit isn't Ajax-driven so the browser does a full page refresh.


yeah, +1. Let's plan to add a check at the beginning of changePage that checks whether the limit has been reached; if so, don't use Ajax.
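The check being proposed could be as small as this; the function and parameter names are made up for illustration, not the actual changePage internals:

```javascript
// Sketch: should this navigation go through Ajax, or fall back to a
// full page load because the DOM already holds too many pages?
function shouldUseAjax(currentPageCount, maxPages) {
  // A maxPages of 0 (or unset) means the limit is disabled.
  if (!maxPages) { return true; }
  return currentPageCount < maxPages;
}
```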


It's weird that DOM size wasn't taken into consideration. If your site has 1,000 pages, the way jQuery Mobile works right now will surely fail.

@jblas jblas was assigned

I think the browser cache already solves this problem very well.

Let's say I want a browser to cache a page, so that future requests to that same page will be served locally from the browser cache instead of being loaded over the network. In that case I let the browser know by adding some HTTP headers to the page (Expires, Cache-Control, etc.). I'm assuming this works in most mobile browsers too.

One clear drawback of the current approach, which this issue touches on, is that if the device becomes memory constrained, it can signal the browser to empty its cache, but it cannot signal our page to clear out stale pages from the DOM.

I understand that keeping pages in the DOM improves transitions, but my guess is that pages served locally from the browser cache load almost instantly too.

So, the best solution in my opinion is to never keep remote pages in the DOM and let the browsers handle the client-side caching. We should then emphasize the importance of using HTTP caching headers and perhaps provide some sample code for popular server-side frameworks (PHP, Rails, etc.)
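For illustration only (the header values below are made-up examples, not a recommendation for any specific max-age), a cacheable page response along these lines would let the browser serve repeat requests locally:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Cache-Control: public, max-age=3600
Expires: Thu, 01 Dec 2011 16:00:00 GMT
```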

I'm lacking any expertise in mobile browsers, so I'm probably missing something. What do you guys think of this?


I agree. This business of storing all pages in the DOM demonstrates complete lack of foresight.


Thanks for the feedback. We're in the process of working on a system for automating page removal without removing certain types of pages that can't be refetched from the server (nested listview pages, local multipages, dynamically generated pages, etc). It's likely that we'll implement this in two ways: first, a configurable limit for the max number of pages to keep in the DOM at any time (which could default to something quite low, maybe zero, who knows), and second, a simple mechanism to specify that a certain page should never be stored in the DOM, or that a certain page type should never be removed from the DOM through the max-limit flushing.

@wietsevenema makes a good point, and I generally agree that simply recommending standard cache configuration settings is a good way to go, but we'll need to test how well this plays out on mobile devices with limited capabilities, even newer ones.


Like wietsevenema and matthewdl said, I also think the best solution is not to build up a proprietary DOM cache. Controlling caching via HTTP headers is common practice that most web developers should be used to, and it gives the browser a chance to manage its own memory. Responses of Ajax GET calls are cached by default according to the HTTP headers (pretty sure). The jQM page cache is redundant. I think the current navigation and page-cache model complicates things unnecessarily.
The idea of a max number of pages is really alien. Since there is no fixed size in KB for web pages and no fixed size of an SQL result set, no one will know how big pages can get. This feature is a game of luck...

I think you guys are trying to manage the browser's memory by writing proprietary JavaScript: this won't work.
I suggest keeping the Ajax handling and removing the page cache completely.



I think this is all really great. We just need to build a prototype to test out how this could really work on mobile devices. We agree that replicating native browser features is something we'd like to avoid.


This is now part of the DOM cache solution so closing.

@toddparker toddparker closed this