I was wondering about the following two aspects concerning load factor:
Why is it a `double`? Is such precision needed? I'm looking at it more from the space perspective: I don't think the extra 4 bytes are needed, even for comparison's sake (I'm not sure this one matters, TBH; comparing floats vs doubles). Especially if you are serializing that `Set` or `Map`, these 4 extra bytes are more of a burden than a help.
A 1.0 load factor. It may sound stupid to want 1.0 as a load factor, but sometimes a `Set` is used only as wire transport, where hash collisions (and their performance implications) don't matter but the memory occupation of that particular `Set` or `Map` does. Another reason: when load factor is not important but space is, users will get around your 1.0 restriction by setting it to 0.99, so in the end I don't think the restriction will have the effect you intended. A rough sketch of the sizing maths is below.
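For illustration only, here is a hedged sketch (a hypothetical helper, not this library's API) of the table capacity a typical power-of-two open-addressing map would allocate for a given size and load factor. It shows why a 0.99 workaround ends up allocating the same table as 1.0 would:

```java
// Hypothetical sizing helper for a power-of-two, open-addressing table.
// Not the library's actual code; just to show the space argument.
final class CapacitySketch
{
    static int requiredCapacity(final int size, final float loadFactor)
    {
        // Minimum number of slots so that size / capacity <= loadFactor.
        final int minCapacity = (int)Math.ceil(size / (double)loadFactor);

        // Round up to the next power of two, as open-addressing maps commonly do.
        return 1 << (32 - Integer.numberOfLeadingZeros(minCapacity - 1));
    }

    public static void main(final String[] args)
    {
        System.out.println(requiredCapacity(1000, 0.5f));  // 2048
        System.out.println(requiredCapacity(1000, 0.99f)); // 1024
        System.out.println(requiredCapacity(1000, 1.0f));  // 1024
    }
}
```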
You are right. There is no need for a double. It could be a float. If you add an issue I'll make the change.
If the load factor is 1.0 then the table can fill completely, leaving no gaps, and thus probing for a missing key would become an infinite loop without an extra check for detecting that the whole table has been scanned.
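To make that concrete, here is a minimal sketch (illustrative field and method names, not this library's actual code) of a linear-probing lookup that relies on hitting an empty slot to terminate. With a load factor of 1.0 the backing array can be completely full, so for an absent key the loop below never finds a gap and spins forever unless a "scanned the whole table" check is added:

```java
// Minimal open-addressing lookup sketch; null in keys[] marks an empty slot.
final class ProbeSketch
{
    private final Object[] keys;
    private final Object[] values;

    ProbeSketch(final int capacity) // capacity assumed to be a power of two
    {
        keys = new Object[capacity];
        values = new Object[capacity];
    }

    Object get(final Object key)
    {
        final int mask = keys.length - 1;
        int index = key.hashCode() & mask;

        while (keys[index] != null)      // terminates only when a gap is reached
        {
            if (key.equals(keys[index]))
            {
                return values[index];
            }
            index = (index + 1) & mask;  // wraps forever if the table has no gaps
        }

        return null;                     // empty slot reached: key not present
    }
}
```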