LoadingCache doesn't get along well with interrupted threads #1122

Open
gissuebot opened this issue Oct 31, 2014 · 12 comments

Original issue created by dterrell on 2012-08-28 at 06:20 PM


Two separate problems here:

  1. When a thread is interrupted while blocking on another thread's cache load, it swallows the interrupt and continues to block. This makes no sense and is not best practice; Future.get() throws InterruptedException for a reason.

  2. When a CacheLoader is interrupted and throws InterruptedException, and/or leaves the interrupted status set while throwing another exception (we tend to turn it into our own RuntimeInterruptedException), that exception is propagated to any other thread blocking on the cache load instead of letting one of them attempt the load.

I think these are both mistakes.
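For concreteness, here is a minimal sketch (not from the original report) that reproduces problem 1 against Guava's CacheBuilder/CacheLoader API; the key "k", the sleep durations, and the thread names are arbitrary:

```java
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import java.util.concurrent.TimeUnit;

public class InterruptedWaiterDemo {
  public static void main(String[] args) throws Exception {
    LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .build(new CacheLoader<String, String>() {
          @Override
          public String load(String key) throws InterruptedException {
            TimeUnit.SECONDS.sleep(10); // slow load, "owned" by the first caller
            return "value";
          }
        });

    // Thread A makes the first get() call for "k" and runs the loader itself.
    Thread loader = new Thread(() -> cache.getUnchecked("k"), "loader");
    loader.start();
    TimeUnit.MILLISECONDS.sleep(100); // give the load time to start

    // Thread B blocks waiting for A's load to finish.
    Thread waiter = new Thread(
        () -> System.out.println("waiter got " + cache.getUnchecked("k")), "waiter");
    waiter.start();
    TimeUnit.MILLISECONDS.sleep(100);

    // Per the report, this interrupt is swallowed: the waiter keeps blocking
    // until the loader thread finishes its 10-second load.
    waiter.interrupt();

    loader.join();
    waiter.join();
  }
}
```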


Original comment posted by cpovirk@google.com on 2012-08-28 at 06:37 PM


  1. I'm resistant to the idea that a method is obligated to be interruptible just because it takes a long time or calls wait() somewhere, but I'm sympathetic to throwing IE here. Given my mixed feelings, I know I discussed this at some point, but I can't find a record of it in my email, so it may have happened in person or in some other format. Anyway, I think it came down to making LoadingCache as transparent as possible: if a thread makes the first call to get() for a key, then interruption will have no effect. In some cases, in fact, it's known that only one thread will ever call get(), in which case "throws InterruptedException" would be entirely bogus. When multiple threads are calling get(), it's odd for interruption to sometimes interrupt the call and sometimes not. This isn't clearly the right decision, but it's what we went with. If it's any consolation, the future AsyncLoadingCache will provide a getFuture() method, and Future interruption will work properly.

  2. Yes. We have an internal bug filed for this, but we haven't done any work to fix it. The cache code is complex enough that any significant change is risky, so we've been waiting to hear from someone who encountered this in the wild. Have you? Again, the future AsyncLoadingCache will work better.


Status: Research
Labels: Type-Defect


Original comment posted by wasserman.louis on 2012-08-28 at 10:48 PM


I'm personally very supportive of continuing to suppress exceptions on getUnchecked, but I can sympathize with the desire to be able to interrupt a cache load.


Original comment posted by dterrell on 2012-09-06 at 06:14 PM


I'd be OK with throwing (a hypothetical) RuntimeInterruptedException from getUnchecked(), but it's not just getUnchecked(); get() also blocks forever.
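For concreteness, the "hypothetical RuntimeInterruptedException" mentioned above might look like the sketch below; the class name and the choice to re-assert the interrupt bit in the constructor are illustrative assumptions, not anything Guava provides:

```java
/**
 * Hypothetical unchecked wrapper for InterruptedException, sketched only to
 * make the comment above concrete; Guava does not define this class.
 */
public class RuntimeInterruptedException extends RuntimeException {
  public RuntimeInterruptedException(InterruptedException cause) {
    super(cause);
    // Re-assert the interrupt so code further up the stack can still observe it.
    Thread.currentThread().interrupt();
  }
}
```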


Original comment posted by cpovirk@google.com on 2012-09-06 at 06:47 PM


Somehow we have no documentation for this :\ I'm accepting this issue for at least the purpose of adding documentation. Thanks.


Status: Accepted
Labels: Milestone-Release14


Original comment posted by cpovirk@google.com on 2012-09-06 at 06:48 PM


(No comment entered for this change.)


Labels: Type-ApiDocs


Original comment posted by kevinb@google.com on 2012-10-15 at 08:24 PM


(No comment entered for this change.)


Labels: -Type-Defect


Original comment posted by kak@google.com on 2012-10-23 at 04:48 PM


(No comment entered for this change.)


Owner: lowasser@google.com


Original comment posted by wasserman.louis on 2013-01-29 at 06:47 PM


(No comment entered for this change.)


Labels: -Milestone-Release14


Original comment posted by kevinb@google.com on 2013-03-12 at 06:43 PM


(No comment entered for this change.)


CC: fry@google.com


Original comment posted by cpovirk@google.com on 2013-03-13 at 03:06 PM


I've written this up in a little more detail at <http://code.google.com/p/guava-libraries/wiki/CachesExplained#Interruption>. I don't think there's anything new to you there, but I'm hoping it's a gentler introduction for those who hadn't already identified the problem themselves.

I've submitted a CL internally (to be mirrored out soon) that links from the LoadingCache Javadocs to that Wiki page.

We may yet do something about problem #2, but that's beyond my capabilities at this point.
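A hedged sketch of the calling pattern the linked writeup implies: if the loader itself is interrupted and lets InterruptedException escape, get() surfaces it as the cause of an ExecutionException, and the caller can restore its own interrupt status before rethrowing. The class and method names here are placeholders:

```java
import com.google.common.cache.LoadingCache;
import java.util.concurrent.ExecutionException;

class InterruptAwareLookup {
  /** Illustrative helper, not Guava API. */
  static String lookup(LoadingCache<String, String> cache, String key)
      throws ExecutionException {
    try {
      return cache.get(key);
    } catch (ExecutionException e) {
      if (e.getCause() instanceof InterruptedException) {
        // The loader was interrupted and let InterruptedException escape;
        // re-assert this thread's interrupt status so callers can see it.
        Thread.currentThread().interrupt();
      }
      throw e;
    }
  }
}
```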


Original comment posted by s...@molindo.at on 2013-03-21 at 10:44 AM


I was working on the functionality of an "AsyncLoadingCache" when I stumbled across this issue (which seems to be the only mention of this future class). I think I've found a neat workaround for this issue and for the AsyncLoadingCache itself: putting an ExecutorService (could be a SameThreadExecutorService!) inside a CacheLoader<K, Future<V>> implementation (see https://gist.github.com/sfussenegger/5212107 for details). However, it's so simple that I'm wondering whether I'm missing the whole point here :) Could you please comment?
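A rough sketch of the Future-in-the-cache workaround described above, assuming a plain cached thread pool rather than the gist's exact setup; the class name, the String key/value types, and the expensiveLoad helper are placeholders:

```java
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FutureCacheSketch {
  private final ExecutorService executor = Executors.newCachedThreadPool();

  // The cache stores Futures, so CacheLoader.load() returns almost immediately
  // and the (possibly slow) work runs on the executor instead.
  private final LoadingCache<String, Future<String>> cache = CacheBuilder.newBuilder()
      .build(new CacheLoader<String, Future<String>>() {
        @Override
        public Future<String> load(final String key) {
          return executor.submit(new Callable<String>() {
            @Override
            public String call() {
              return expensiveLoad(key);
            }
          });
        }
      });

  // Callers block in Future.get(), which is interruptible, rather than on the
  // cache's internal loading machinery.
  public String get(String key) throws Exception {
    return cache.getUnchecked(key).get();
  }

  private static String expensiveLoad(String key) {
    return "value-for-" + key; // placeholder for the real computation
  }
}
```

Because callers now block in Future.get() rather than inside the cache's loading code, interrupting a waiting thread behaves the way java.util.concurrent users expect.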


Original comment posted by cpovirk@google.com on 2013-03-21 at 03:25 PM


I realized that I no longer remembered what the point was, so I did some research :) I've filed issue 1350 with some of my results. It can serve as a forum for general AsyncLoadingCache discussions. As you'll see there, LoadingCache<K, Future<V>> works quite well in the simple cases and less well thereafter.
