[Test]
public void T080_PutGetWithoutMaxCap_key_as_string()
{
  using (var cache = makeCache())
  {
    var tA = cache.GetOrCreateTable("A");
    const int CNT = 80000000;
    for (var i = 0; i < CNT; i++)
    {
      var pr = tA.Put(i.ToString(), "value" + i.ToString(), priority: 10);
      // Console.WriteLine("{0} -> {1}", i, pr);
      Assert.IsTrue(pr == PutResult.Inserted); // fails here even though key values are unique
    }
  }
}
@rugunda this is not a bug. The cache is "speculative": it does not GUARANTEE that a Put will ALWAYS insert rather than replace. The table is purposely designed as a linear hash array without secondary rehashing; it is capacity-based and does not handle collisions (for efficiency). When you put a new item into the cache, there is always a chance that the target slot is already occupied. To keep a second item from overwriting the first, you can set a "priority" on the items.
This is done on purpose: in the real world collisions are fairly rare, and the capacity of the table auto-adjusts. It is a probability-based solution, so it does not work the same way as a Dictionary, and that is why it is faster for all operations (no chaining/rehashing).
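To make the "speculative" behavior concrete, here is a minimal sketch (in Python, and not the actual NFX implementation; the class name, slot layout, and return strings are all invented for illustration) of a fixed-size linear hash array with no chaining or rehashing, where a colliding item wins the slot only if its priority is at least the occupant's:

```python
# Illustrative sketch of a "speculative" put: a fixed-size linear hash
# array with no chaining and no rehashing. On a slot collision between
# different keys, the incoming item evicts the occupant only if its
# priority is >= the occupant's; otherwise the put is rejected.

class SpeculativeTable:
    def __init__(self, capacity):
        # Each slot holds (key, value, priority) or None.
        self.slots = [None] * capacity

    def put(self, key, value, priority=0):
        idx = hash(key) % len(self.slots)
        slot = self.slots[idx]
        if slot is None:
            self.slots[idx] = (key, value, priority)
            return "Inserted"
        if slot[0] == key:
            # Same key: a normal replace.
            self.slots[idx] = (key, value, priority)
            return "Replaced"
        if priority >= slot[2]:
            # Different key hashed to the same slot: evict the
            # lower-or-equal-priority occupant.
            self.slots[idx] = (key, value, priority)
            return "Overwritten"
        return "Collision"  # incoming item was rejected

    def get(self, key):
        slot = self.slots[hash(key) % len(self.slots)]
        return slot[1] if slot is not None and slot[0] == key else None
```

With a tiny capacity, distinct keys inevitably share slots, so a put cannot return "Inserted" every time; this is the same reason the assertion in the test above fails even though every key is unique.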
@itadapter Thanks for the clarification. I have been using the serializer, and it blows away anything else I have ever come across. Really cool components in NFX.