Is your feature request related to a problem? Please describe.
An ETS RAM table can fail if the amount of RAM it uses is very large (probably more than 128 GB, according to @sverker).
Describe the solution you'd like
A higher limit - preferably high enough that the amount of RAM on very big systems (like heavy AI systems with TB of RAM) will not be an issue in the foreseeable future.
Describe alternatives you've considered
No alternatives - just a wider type than a 32-bit signed integer for the total number of buckets.
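The failure mode behind the request is ordinary signed 32-bit overflow: once the bucket counter passes 2^31 - 1, the next increment wraps to a negative value. A minimal sketch of that wraparound, emulating a C `int` counter in Python (this is an illustration of the overflow, not the actual ETS C code):

```python
import ctypes

def next_bucket_count(current):
    """Increment a counter stored as a C 'int' (32-bit signed),
    as the ETS bucket count reportedly is."""
    return ctypes.c_int32(current + 1).value

print(next_bucket_count(100))        # normal growth: 101
print(next_bucket_count(2**31 - 1))  # wraps to -2147483648
```

Widening the counter to a 64-bit (or word-sized) integer pushes the same limit far beyond any plausible RAM size.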
Additional context
From a nice dialogue on Erlang Forums with @sverker
@inforista
Actually, with this ETS hash implementation - are there "edge" cases that can break the ETS table? For ETS set tables, in principle, can you have an unlimited-size table, where the only limit is the RAM size of the system? Like TB of RAM - will the ETS set table still work in this extreme case? Or are there physical limitations I need to consider for really extreme use of ETS set tables?
@sverker
"Well, as you ask, I checked, and it seems we keep the total number of buckets in a 32-bit signed integer. So, if you can fill a table with more than 2 billion keys you might hit that limit. If I calculate correctly that would demand an absolute minimum of 128 GB of memory.
That is easy to fix, and we should probably do it."
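The quoted 128 GB figure checks out as a back-of-envelope calculation. The 64-bytes-per-key footprint below is an assumption inferred from the quote (it is roughly what a hash bucket slot plus a small tuple and its header cost), not a number stated in the source:

```python
# Back-of-envelope check of the 128 GB minimum quoted above.
keys = 2**31           # the signed 32-bit bucket counter overflows past ~2 billion
bytes_per_key = 64     # assumed minimum per-object footprint (not from the source)

total_bytes = keys * bytes_per_key
print(total_bytes / 2**30)  # -> 128.0 (GiB)
```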