There is a bug in the function find_interior_node_in_bucket() in hash_table.h that prevents the hash table from finding some nodes.
When iterating through the memory pages that make up a hash table bucket, the loop stops one element too early.
When the page contains only the node we're looking for, then pageEndIndex == pindex + nodeSize. However, the bounds-check if statement then incorrectly returns not found. This can easily be fixed by changing the comparison operator to strictly greater-than:
if (pindex + nodeSize > pageEndIndex)
    return 0xFFFFFFFF;
The second issue is the loop itself. At this point the size of the node being searched for has already been subtracted from pageEndIndex, which now points to the final index in the page that can contain the node without going out of bounds. Hence the comparison operator in this if statement is also wrong and should be greater-than-or-equal:
if (pindex + nodeSize >= pageEndIndex)
    return 0xFFFFFFFF;