
Underflow error when size of pinned queries is larger than CACHE_MAX_SIZE #1234

hannahbast opened this issue Jan 20, 2024 · 1 comment


@hannahbast
Member

Since #1067, memory sizes are no longer specified as integers, but with units, and they have a proper type in QLever.

However, that change gave rise to an error that did not occur before (or went unnoticed). Namely, when the total size of the pinned queries is larger than CACHE_MAX_SIZE, any further query results in the following error message:

Underflow error: Subtraction of the two given 'MemorySize's is not possible. It would result in a size_t underflow.

There are two possible fixes (and maybe it makes sense to do both):

  1. Don't allow pinning a query when the total size of the pinned queries would then exceed CACHE_MAX_SIZE. This would also be consistent with the existing behavior that a single query whose size exceeds CACHE_MAX_SIZE_SINGLE_ENTRY cannot be pinned.

  2. Catch the underflow error above and return a proper error message, something along the lines of "no memory left" (we already have such a message in other places); see the sketch below.
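For illustration, the second option could look roughly like this at the place where the cache makes room for a new result (a sketch only; the name makeRoomForEntry and the exact exception type thrown by the MemorySize subtraction are assumptions, not the actual QLever code):

// Sketch of option 2 (illustrative names, not the actual QLever code):
// catch the underflow from the MemorySize subtraction and rethrow it with a
// message that tells the user that the cache has run out of memory.
try {
  makeRoomForEntry(sizeOfNewResult);
} catch (const std::exception&) {
  throw std::runtime_error(
      "The cache has no memory left for this query result: the pinned "
      "results alone already occupy more than CACHE_MAX_SIZE");
}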

@schlegan @joka921 Can you have a look? What do you think?

@schlegan
Contributor

Alright, I took a look and found the only line that could be responsible for this error, in src/util/Cache.h:343:

if (_maxSize - _totalSizePinned < sizeToMakeRoomFor) {
      return false;
}
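
Independent of which fix we go for, this comparison could be rewritten so that it never subtracts a larger MemorySize from a smaller one, which would avoid the underflow (an untested sketch using the same variables as above):

// Sketch: same check, but without a subtraction that can underflow.
// If the pinned entries alone already exceed the capacity, there is no room.
if (_totalSizePinned >= _maxSize ||
    _maxSize - _totalSizePinned < sizeToMakeRoomFor) {
  return false;
}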

I'm definitely in favor of the first solution (don't allow pinning a query when the total size of the pinned queries would then exceed CACHE_MAX_SIZE), and I was under the impression that this was already the case.

After all, the documentation of the FlexibleCache says:

@brief Associative array for almost arbitrary keys and values that acts as a cache with fixed memory capacity.

What would be the point of a cache with a maximum capacity if the maximum capacity is ignored?
