
panic: concurrent deque entered unreachable code #11

Closed
eeczw opened this issue Aug 9, 2023 · 4 comments

@eeczw

eeczw commented Aug 9, 2023

version: 0.10.1
problem: I'm using mini-moka to build an in-memory cache for a highly concurrent online service, and I've recently observed occasional panics originating here:

_ => unreachable!(),

I'd appreciate it if you could spare some time to fix this. Thanks!

Here is part of the backtrace:

#5  0x55db18df8ae0 in rust_begin_unwind at /rustc/864bdf7843e1ceabc824ed86d97006acad6af643/library/std/src/panicking.rs:617
#6  0x55db187906d0 in core::panicking::panic_fmt::haa55128da9cd75d4 at /rustc/864bdf7843e1ceabc824ed86d97006acad6af643/library/core/src/panicking.rs:67
#7  0x55db18790890 in core::panicking::panic::hb4c75d9c5b922684 at /rustc/864bdf7843e1ceabc824ed86d97006acad6af643/library/core/src/panicking.rs:117
#8  0x55db18c729f0 in mini_moka::common::concurrent::deques::Deques<K>::move_to_back_ao::h6634499d6c51838b at /root/.cargo/registry/src/rsproxy.cn-8f6827c7555bfaf8/mini-moka-0.10.1/src/common/concurrent/deques.rs:
#9  0x55db18c2f330 in <mini_moka::sync::base_cache::Inner<K,V,S> as mini_moka::common::concurrent::housekeeper::InnerSync>::sync::h65edb4f953bc2fa3 at /root/.cargo/registry/src/rsproxy.cn-8f6827c7555bfaf8/mini-moka-0.10.1/src/sync/base_cache.rs:667
#10  0x55db18c43110 in mini_moka::sync::base_cache::BaseCache<K,V,S>::get_with_hash::{{closure}}::h1faf687a57deb0fd at /root/.cargo/registry/src/rsproxy.cn-8f6827c7555bfaf8/mini-moka-0.10.1/src/sync/base_cache.rs:153
#11  0x55db18c2baf0 in mini_moka::sync::cache::Cache<K,V,S>::get::haf41a911c172cc48 at /root/.cargo/registry/src/rsproxy.cn-8f6827c7555bfaf8/mini-moka-0.10.1/src/sync/cache.rs:421
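
For context, the access pattern described above can be sketched roughly as follows. This is only an illustration under assumed key/value types, capacity, and thread counts (none of which are given in the report), not a known reproducer for the panic:

```rust
use std::{thread, time::Duration};
use mini_moka::sync::Cache;

fn main() {
    // Shared cache; `Cache` is cheap to clone and all clones share the same store.
    let cache: Cache<u64, String> = Cache::builder()
        .max_capacity(10_000)
        .time_to_idle(Duration::from_secs(60))
        .build();

    let handles: Vec<_> = (0..8)
        .map(|t| {
            let cache = cache.clone();
            thread::spawn(move || {
                for i in 0..1_000u64 {
                    cache.insert(i, format!("value-{t}-{i}"));
                    // Per the backtrace, `get` can run the internal housekeeping
                    // (`sync`), which is where the panic was raised.
                    let _ = cache.get(&i);
                }
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
}
```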
@tatsuya6502 tatsuya6502 self-assigned this Aug 9, 2023
@tatsuya6502 tatsuya6502 added the bug Something isn't working label Aug 9, 2023
@tatsuya6502
Member

Thank you for reporting the issue. I checked whether there is any code path that could cause this panic, but I could not find one. I have started making some changes anyway in #15, but I am not sure whether they will fix the panic.

I also want to reproduce the panic locally. I have some questions for you.

  1. Do you set the max capacity of the cache?
    • Cache::new(max_capacity) or Cache::builder().max_capacity(max_capacity)
  2. Do you set TTI and/or TTL on the cache?
    • Cache::builder().time_to_idle(tti).time_to_live(ttl)
  3. Do you set a weigher?
    • Cache::builder().weigher(...)
  4. Do you update already cached entries using insert?
  5. Do you invalidate entries with invalidate(&key)?
  6. Do you invalidate all entries by invalidate_all()?

@eeczw
Author

eeczw commented Aug 12, 2023

Thanks for your effort investigating this problem! I was a bit worried that I hadn't provided enough information, and I didn't know what else I could report back then.

To answer your questions:

  1. Yes.
  2. Yes.
  3. No.
  4. Yes.
  5. Yes.
  6. No.
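
Putting those answers together, a minimal sketch of the cache configuration and usage pattern they describe might look like the following. The concrete capacity, TTI/TTL values, and key/value types are placeholders, not values taken from the report:

```rust
use std::time::Duration;
use mini_moka::sync::Cache;

fn main() {
    // Answers 1-3: max capacity and TTI/TTL are set, but no weigher.
    let cache: Cache<u64, String> = Cache::builder()
        .max_capacity(10_000)
        .time_to_idle(Duration::from_secs(60))
        .time_to_live(Duration::from_secs(300))
        .build();

    // Answer 4: already cached entries are updated by calling `insert` again.
    cache.insert(1, "v1".to_string());
    cache.insert(1, "v2".to_string());

    // Answer 5: individual entries are removed with `invalidate`.
    cache.invalidate(&1);

    // Answer 6: `invalidate_all` is not used.
}
```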

@tatsuya6502 tatsuya6502 added this to the v0.10.2 milestone Aug 12, 2023
@tatsuya6502
Member

Thank you for answering. I confirmed that these answers match the "possible steps to reproduce" in the description of #15, so I believe #15 should fix the panic you hit.

I will merge #15 soon and publish v0.10.2 to crates.io.

@tatsuya6502
Member

Published v0.10.2 with the fix to crates.io.
