Only store new session data if the data is non-default #161

wants to merge 1 commit

1 participant


Forcing every person (and web crawler) who visits the site to receive a unique cookie means that every page is unique, and thus un-cacheable. It also means that Googlebot crawling my site leaves millions of sessions, all containing nothing but default data, clogging the database.

I've made a small change to web/ so that it only stores sessions and sets cookies if the data differs from the initializer. With that done, I can now stick Varnish in front of the site, and it correctly serves cached pages to anonymous browsers and dynamic pages to logged-in users, drastically reducing the load.

(This is a pull-request version of #158, as requested)
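The core idea of the change can be sketched in standalone form: before persisting a session, strip the per-request bookkeeping keys and compare what remains against the defaults the session was initialized with; if nothing changed, store nothing and set no cookie. This is an illustrative sketch, not web.py's actual API — `DEFAULTS`, `save_session`, and the key names are hypothetical stand-ins.

```python
# Hypothetical sketch of the patch's idea: only persist a session (and
# only set a cookie) when its data differs from the initializer.
# Names are illustrative, not web.py's real API.

DEFAULTS = {'logged_in': False, 'user': None}  # assumed initializer

def save_session(session_data, store, session_id):
    """Persist session_data into store; return True if stored, False if skipped."""
    # Strip bookkeeping keys that differ on every request anyway.
    current = {k: v for k, v in session_data.items()
               if k not in ('session_id', 'ip')}
    if current == DEFAULTS:
        # Anonymous visitor with untouched defaults: store nothing and
        # send no Set-Cookie, so the response stays cacheable.
        return False
    store[session_id] = dict(session_data)
    return True

store = {}
# Googlebot-style request, defaults untouched: nothing stored.
save_session({'session_id': 'a1', 'ip': '1.2.3.4',
              'logged_in': False, 'user': None}, store, 'a1')   # returns False
# Logged-in user, data differs from defaults: session persisted.
save_session({'session_id': 'b2', 'ip': '5.6.7.8',
              'logged_in': True, 'user': 'shish'}, store, 'b2')  # returns True
```

With this in place, every anonymous page view produces an identical, cookie-free response, which is exactly what a cache like Varnish needs in order to serve it from cache.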

Commits on Jun 28, 2012
  1. @shish
Showing 1 changed file with 7 additions and 2 deletions.
  1. +7 −2 web/
@@ -133,11 +133,16 @@ def _validate_ip(self):
                 return self.expired()
 
     def _save(self):
-        if not self.get('_killed'):
+        current_values = dict(self._data)
+        del current_values['session_id']
+        del current_values['ip']
+
+        if not self.get('_killed') and current_values != self._initializer:
             self._setcookie(self.session_id)
             self.store[self.session_id] = dict(self._data)
         else:
-            self._setcookie(self.session_id, expires=-1)
+            if web.cookies().get(self._config.cookie_name):
+                self._setcookie(self.session_id, expires=-1)
 
     def _setcookie(self, session_id, expires='', **kw):
         cookie_name = self._config.cookie_name
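The second half of the patch guards the kill path: an expiring Set-Cookie header is only sent if the client actually presented a session cookie, so responses to cookie-less visitors stay header-identical and cacheable. A minimal sketch of that guard, with `request_cookies` and `response_headers` as hypothetical stand-ins for `web.cookies()` and `_setcookie()`:

```python
# Sketch of the guarded cookie-expiry from the patch: only tell the
# client to discard its cookie if it sent one in the first place.
# request_cookies / response_headers are illustrative stand-ins for
# web.cookies() and _setcookie(), not web.py's actual signatures.

def expire_cookie_if_present(request_cookies, cookie_name, response_headers):
    if request_cookies.get(cookie_name):
        # Client presented a cookie: instruct it to drop it.
        response_headers.append(('Set-Cookie', cookie_name + '=; Max-Age=0'))
    # Otherwise emit nothing, keeping the response cacheable.

headers = []
expire_cookie_if_present({}, 'webpy_session_id', headers)  # no header added
expire_cookie_if_present({'webpy_session_id': 'x'}, 'webpy_session_id', headers)
```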