I'm confused by the huge runtime memory footprint when using MedialGenerator3d to create a particle distribution from given seedPositions. With 500,000 particles the resident memory climbs to an astonishing 110 GB, and the memory is not automatically released after the particle distribution has been created. I tried to optimize this but failed; here is the relevant part of the program:
generator2 = MedialGenerator3d(n             = GivenNumber,
                               rho           = rho02,
                               boundary      = boundaryShape2,   # the boundary of the generator
                               centroidFrac  = 0.1,              # limit on the distribution near surface joints
                               maxIterations = 1,
                               fracTol       = 1.0e-3,
                               seedPositions = GivenPositions,
                               nNodePerh     = nPerh)
distribute.append((nodes2, generator2))
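For reference, here is a minimal sketch of how I bracket the generator construction to see where the memory jump happens and whether it comes back. It assumes psutil is installed and that the variables above (GivenNumber, rho02, boundaryShape2, GivenPositions, nPerh, nodes2, distribute) are already defined; rss_gb is just a hypothetical helper for this measurement, not part of Spheral:

import gc
import os
import psutil

def rss_gb():
    # Resident set size of this process in GB, as reported by psutil.
    return psutil.Process(os.getpid()).memory_info().rss / 1024.0**3

print("RSS before MedialGenerator3d: %.1f GB" % rss_gb())
generator2 = MedialGenerator3d(n             = GivenNumber,
                               rho           = rho02,
                               boundary      = boundaryShape2,
                               centroidFrac  = 0.1,
                               maxIterations = 1,
                               fracTol       = 1.0e-3,
                               seedPositions = GivenPositions,
                               nNodePerh     = nPerh)
print("RSS after MedialGenerator3d:  %.1f GB" % rss_gb())

distribute.append((nodes2, generator2))

# Once the distribution has been consumed, drop every reference to the
# generator (the tuple stored in 'distribute' also holds one) and force a
# collection to check whether Python actually returns the memory.
distribute[:] = []
del generator2
gc.collect()
print("RSS after del + gc.collect(): %.1f GB" % rss_gb())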