Describe the bug
CaffeineCache calculates the "weight" of individual entries at insertion time, using the weigh function:
private int weigh(Object oKey, Object oValue)
    {
    int cUnits = m_unitCalculator.calculateUnits(oKey, oValue);
    if (cUnits < 0)
        {
        throw new IllegalStateException(String.format(
            "Negative unit (%s) for %s=%s", cUnits, oKey, oValue));
        }
    return (cUnits / m_nUnitFactor);
    }
This is problematic, because the method above returns 0 whenever the unit factor is larger than the number of units for a given entry, which is often the case when a unit factor is used (typical values are 2^10 and 2^20, in order to track and limit cache size in KBs or MBs). Even when the factor is smaller than the entry size, the loss of precision due to integer arithmetic at the single-entry level is significant.
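A minimal standalone illustration of the truncation (this is a hypothetical demo class, not the Coherence source; the factor and entry sizes are made up for the example):

```java
// Demonstrates how per-entry integer division truncates entry weights,
// mirroring the division performed in the weigh method above.
public class WeighTruncation {
    static final int UNIT_FACTOR = 1 << 10; // 1 KB, a typical unit factor

    // per-entry weight, as computed at insertion time
    static int weigh(int cUnits) {
        return cUnits / UNIT_FACTOR;
    }

    public static void main(String[] args) {
        // a 500-byte entry weighs 0 units: its size is lost entirely
        System.out.println(weigh(500));   // 0

        // ten such entries still add up to 0 units, so the cache
        // appears empty even though it holds 5000 bytes
        int total = 0;
        for (int i = 0; i < 10; i++) {
            total += weigh(500);
        }
        System.out.println(total);        // 0

        // even a 2500-byte entry silently loses 452 bytes: 2500 / 1024 = 2
        System.out.println(weigh(2500));  // 2
    }
}
```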
To Reproduce
1. Configure a unit factor that is larger than the average entry size
2. Configure the BINARY unit calculator
3. Add some entries to the cache
4. Observe that the getUnits method returns zero, even though there are entries in the cache
Expected behavior
The getUnits method should return the actual number of bytes consumed by all entries, divided by the unit factor.
In other words, the unit factor should only be applied during the final calculation for the backing map as a whole, not at the individual-entry level.
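A sketch of that proposed behavior (an assumption about the fix, not the actual Coherence implementation; class and method names are hypothetical): accumulate raw per-entry units and divide only once, in getUnits.

```java
// Sketch: apply the unit factor once, over the aggregate, instead of per entry.
public class AggregateUnits {
    static final int UNIT_FACTOR = 1 << 10; // track and limit size in KBs

    private long cTotalUnits; // raw byte count across all entries

    void onInsert(int cUnits) {
        cTotalUnits += cUnits; // no per-entry division, so no precision loss
    }

    int getUnits() {
        // single division for the backing map as a whole
        return (int) (cTotalUnits / UNIT_FACTOR);
    }

    public static void main(String[] args) {
        AggregateUnits units = new AggregateUnits();
        for (int i = 0; i < 10; i++) {
            units.onInsert(500); // ten 500-byte entries
        }
        // 5000 bytes / 1024 = 4 units, instead of the 0 reported
        // when each 500-byte entry is divided (and truncated) individually
        System.out.println(units.getUnits()); // 4
    }
}
```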
Environment (please complete the following information):
Coherence CE 22.06.1
Java 11
OS: any
OS Version: n/a