
fix(core) remove a memory leak introduced in the DB caching #3278

Merged: thibaultcha merged 4 commits into master from fix/leak-3277 on Mar 10, 2018

Conversation

thibaultcha (Member) commented on Mar 8, 2018

This fixes a regression introduced in de6390e and released in 0.12.2 and 0.13.0rc1. It was reported in #3277 and investigated by @hishamhm and me.

This PR takes a conservative approach to fixing the underlying issues: it introduces three commits, ordered from least to most disruptive, in an attempt to release a safe 0.12.3 hotfix:

  • avoid generating on-the-fly cache callback closures in the router cache
  • use a different caching key for the soft_to_hard_error wrapping closures, so that the underlying callbacks are no longer prevented from being GC'ed
  • backport an upstream mlcache feature to get rid of the (temporary) soft_to_hard_error wrapper workaround altogether (thibaultcha/lua-resty-mlcache@4fdfbb9)

Part of the fix for #3277, a regression introduced in
de6390e.

More details are included in the next commit.
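As an illustration of the first commit's idea (a sketch only; the function and key names below are hypothetical, not Kong's actual router code): hoisting the callback to a named, long-lived function means the same function object is passed to the cache on every call, instead of a fresh closure being created per request.

    -- before: a new anonymous closure is created on every cache lookup,
    -- so each lookup registers a distinct callback object
    local router, err = cache:get("router:version", opts, function()
        return build_router()
    end)

    -- after: a single module-level callback, reused across lookups
    local function fetch_router_version()
        return build_router()
    end

    local router, err = cache:get("router:version", opts, fetch_router_version)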
Part of the fixes for #3277, a regression introduced in
de6390e.

The soft_to_hard_error wrapper tries its best to be GC-friendly by not
keeping references to the DB cache callbacks: they are only used as keys
to index their corresponding soft_to_hard_error wrappers.

Unfortunately, because the created closure holds a reference to the
callback itself, said callback cannot be garbage-collected.
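A minimal sketch of the problematic pattern (a hypothetical reconstruction, assuming the wrappers live in a weak-keyed table; this is not Kong's verbatim implementation):

    -- wrappers are indexed by their callback in a weak-keyed table, but
    -- each wrapper closure captures the callback as an upvalue; since the
    -- table's value (the wrapper) keeps its key (the callback) reachable,
    -- the entry can never be collected
    local wrappers = setmetatable({}, { __mode = "k" })

    local function soft_to_hard_error(cb)
        local wrapper = wrappers[cb]
        if not wrapper then
            wrapper = function(...)
                local value, err = cb(...) -- this upvalue pins cb in memory
                if err then
                    error(err, 2) -- turn the soft error into a hard one
                end
                return value
            end
            wrappers[cb] = wrapper
        end
        return wrapper
    end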

This is especially troublesome for callbacks that are generated on the
fly, like so:

    local value, err = cache:get("key", opts, function()
        return "cached value"
    end)

The above anonymous function should be garbage-collected, but the
soft_to_hard_error wrapper still holds a reference to it.
This pattern could be observed prior to the parent commit, in which we
replaced the cache callback so that it no longer creates a function on
the fly.

The drawback of this change is a larger memory footprint: the same
cache callback will result in several copies of the same
soft_to_hard_error function if it is invoked with several caching
keys.
This is part of the fixes for #3277, a regression introduced in
de6390e.
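One plausible shape of this interim fix (a sketch under assumptions, not the verbatim patch): index the wrappers by the caching key in a weak-valued table, so the callback is never used as a table key, and the wrapper (along with the callback it captures) becomes collectable once nothing else references it.

    -- keyed by the caching key; weak values let unused wrappers be
    -- collected, which in turn releases the callbacks they capture
    local wrappers = setmetatable({}, { __mode = "v" })

    local function soft_to_hard_error(cache_key, cb)
        local wrapper = wrappers[cache_key]
        if not wrapper then
            wrapper = function(...)
                local value, err = cb(...)
                if err then
                    error(err, 2)
                end
                return value
            end
            wrappers[cache_key] = wrapper
        end
        return wrapper
    end

As noted above, the same callback used with several caching keys now yields several wrapper copies, which is the footprint trade-off described in this commit.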

Includes a port of the following mlcache commit:

thibaultcha/lua-resty-mlcache/commit/4fdfbb976a9b4cda0f277ffc71294a14481a9c4d

Supporting callback-returned errors allows us to get rid of the
temporary wrapper workaround that threw hard errors whenever a callback
returned an error.
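A usage sketch of the ported behavior (the entity and lookup names are illustrative only): with this feature, a callback signals failure by returning nil plus an error, and get() propagates that error to its caller, so no throwing wrapper is needed anymore.

    local value, err = cache:get(key, opts, function()
        local row, err = db.apis:find(api_id) -- hypothetical DB lookup
        if err then
            return nil, err -- now propagated by get() to the caller
        end
        return row
    end)
    if err then
        -- handle the DB error here; no pcall or hard error involved
    end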
hishamhm (Contributor) commented on Mar 9, 2018

@thibaultcha fixes look great! I added a regression test to the PR.

Tieske (Member) left a comment


👍

@thibaultcha thibaultcha merged commit 6abacf0 into master Mar 10, 2018
@thibaultcha thibaultcha deleted the fix/leak-3277 branch March 10, 2018 00:55
kswen commented on Mar 14, 2018

Will the stable 0.13 version be released today, or has it been postponed?

hishamhm (Contributor) commented
@kswen Given we had another rc this week, we will give some more time for the release candidate to do the rounds.

thibaultcha (Member, Author) commented

@kswen Glad you asked: we set the new date to March 21st earlier today. Of course, you are welcome to try it and report any issues you may find (or else 0.13.0 will risk containing unreported bugs...).
