"All memory allocations are in use" error when looping through a large Astar object whilst calling get_point_position() or get_point_connections().
#54252
Closed
KnightNine opened this issue Oct 26, 2021 · 3 comments
If you add a large enough number of points (my test uses around 100,000) to an Astar object and then loop through all of them while calling the object's get_point_position() or get_point_connections() functions, you will receive this error on every iteration:
ERROR: All memory pool allocations are in use.
At: ./core/pool_vector.h:530
ERROR: set: Index p_index = -1 is out of bounds (size() = 0).
At: ./core/pool_vector.h:494
The error being printed on every iteration slows the loop down to a snail's pace, considering the number of points.
Further Experimentation:
(hopefully useful though not exactly "brief")
Strangely enough, calling get_point_position() and get_point_connections() on this 100,000+ point Astar object doesn't seem to trigger the error on its own; the issue is only present when I loop through a large enough number of the points while calling these functions, like this:

```gdscript
var point_data = {}
var t = 0
while t < 100000:
    var point_index = t
    var pos = astar_obj.get_point_position(point_index)
    var connections = astar_obj.get_point_connections(point_index)
    point_data[point_index] = [pos, connections]
    t += 1
    #print([pos, connections]) # this fixes the issue but also slows down the loop for the same reason as the error
```

But looping through a lesser number of points works fine (i.e. `while t < 100` works). Or if I print something within the loop (uncommenting the print line above), no error is shown.
Or if I remove the point_data dictionary's data collection from the loop, no error is shown (i.e. removing this line: `point_data[point_index] = [pos, connections]`).
(I also tried converting point_data to an array; it makes no difference.)
I then tried converting point_data to a string and collecting the data by appending to it like so: `point_data += var2str([pos, connections]) + ","`. Surprisingly, this worked, but I can't attest to how efficient repeated calls to var2str() are.
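As a sketch, the string-based collection described above would look something like this (untested here, and presumably effective because only a plain String accumulates data, so no Pool*Array results are retained between iterations):

```gdscript
# Sketch of the string-append workaround described above (not verified here).
var point_data = ""
var t = 0
while t < 100000:
    var pos = astar_obj.get_point_position(t)
    var connections = astar_obj.get_point_connections(t)
    # Serialize the pooled result immediately instead of storing it.
    point_data += var2str([pos, connections]) + ","
    t += 1
```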
So the issue isn't the Astar object breaking entirely due to having a large number of points, but rather it breaking in response to being looped through while its data is collected into an array or dictionary.
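One possible workaround is to copy each result into a regular Array before storing it. This is an untested sketch, assuming the pool slot used by the temporary PoolIntArray is released once the temporary goes out of scope:

```gdscript
# Untested sketch: store plain Arrays instead of the pooled PoolIntArray
# returned by get_point_connections(), so no pool-backed objects are retained.
var point_data = {}
var t = 0
while t < 100000:
    var pos = astar_obj.get_point_position(t)  # Vector3, not pool-backed
    # Array(...) makes a regular (non-pooled) copy of the PoolIntArray
    var connections = Array(astar_obj.get_point_connections(t))
    point_data[t] = [pos, connections]
    t += 1
```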
I'm also not sure what other Astar object functions might be subject to this issue.
Is there a way to circumvent this issue without modifying Godot?
You could write an A* implementation in GDScript that doesn't rely on Pool*Array for large amounts of items. That said, depending on the number of platforms you want to target, I think recompiling the editor and export templates with my PR included might be less work.
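A minimal sketch of such a GDScript-only A* (hypothetical names throughout; it stores everything in plain Dictionaries and Arrays, so no Pool*Array allocations are involved):

```gdscript
# Hypothetical minimal A* in pure GDScript, avoiding Pool*Array entirely.
class_name SimpleAStar

var points = {}     # id -> Vector3
var neighbors = {}  # id -> Array of connected ids

func add_point(id, pos):
    points[id] = pos
    neighbors[id] = []

func connect_points(a, b):
    neighbors[a].append(b)
    neighbors[b].append(a)

func get_path_ids(start, goal):
    var open_set = [start]
    var came_from = {}
    var g = {start: 0.0}
    var f = {start: points[start].distance_to(points[goal])}
    while open_set.size() > 0:
        # Pick the open node with the lowest f score (linear scan for brevity;
        # a real implementation would use a binary heap).
        var current = open_set[0]
        for id in open_set:
            if f[id] < f[current]:
                current = id
        if current == goal:
            var path = [current]
            while came_from.has(current):
                current = came_from[current]
                path.push_front(current)
            return path
        open_set.erase(current)
        for n in neighbors[current]:
            var tentative = g[current] + points[current].distance_to(points[n])
            if not g.has(n) or tentative < g[n]:
                came_from[n] = current
                g[n] = tentative
                f[n] = tentative + points[n].distance_to(points[goal])
                if not open_set.has(n):
                    open_set.append(n)
    return []  # no path found
```

The linear scan over open_set keeps the sketch short; for 100,000+ points a priority queue would be needed for acceptable performance.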
Godot version
3.3.4
System information
Windows 10, Intel i7-8850H
Steps to reproduce
You can use this script to reproduce the error:
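Presumably this is the loop from the issue description:

```gdscript
var point_data = {}
var t = 0
while t < 100000:
    var point_index = t
    var pos = astar_obj.get_point_position(point_index)
    var connections = astar_obj.get_point_connections(point_index)
    point_data[point_index] = [pos, connections]
    t += 1
```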
Minimal reproduction project
No response