I am executing something like the following:
from numpy import *
from scipy import interpolate as irp

w = 61
t = linspace(-1, 1, w)
xx, yy = meshgrid(t, t)
xx, yy = xx.flatten(), yy.flatten()  # points for evaluation
P = c_[sign(xx)*abs(xx)**0.65, sign(yy)*abs(yy)**0.65]  # points for interpolation
del t

x = random.randint(0, 255, (2*w**2, w**2)).astype(uint8)  # fake interpolation values - each row corresponds to one interpolation
xo = zeros(x.shape, dtype=uint8)  # for storing results
for i in range(x.shape[0]):  # memory accumulates here, I think
    f = irp.LinearNDInterpolator(P, x[i, :])
    out = f(xx, yy)
    xo[i, :] = out
    del f, out
on an Intel Core 2 Duo MacBook running OS X 10.6 with 2 GB of RAM. The memory required to perform this interpolation a single time is not large (~100 MB), yet the loop fails: Python slowly acquires all of my memory as the loop progresses. It doesn't make sense that an operation needing only on the order of megabytes grabs more than a gigabyte and then errors out because it can't get more. I receive this error message:
Python(10783) malloc: *** mmap(size=16777216) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
Traceback (most recent call last):
  File "make_lens.py", line 16, in <module>
    f = irp.LinearNDInterpolator( P, x[i,:] )
  File "interpnd.pyx", line 204, in scipy.interpolate.interpnd.LinearNDInterpolator.__init__ (scipy/interpolate/interpnd.c:3794)
  File "qhull.pyx", line 1703, in scipy.spatial.qhull.Delaunay.__init__ (scipy/spatial/qhull.c:13267)
  File "qhull.pyx", line 1432, in scipy.spatial.qhull._QhullUser.__init__ (scipy/spatial/qhull.c:11989)
  File "qhull.pyx", line 1712, in scipy.spatial.qhull.Delaunay._update (scipy/spatial/qhull.c:13470)
  File "qhull.pyx", line 526, in scipy.spatial.qhull._Qhull.get_simplex_facet_array (scipy/spatial/qhull.c:5453)
  File "qhull.pyx", line 594, in scipy.spatial.qhull._Qhull._get_simplex_facet_array (scipy/spatial/qhull.c:6010)
MemoryError
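To confirm that the footprint grows a little on every iteration (rather than spiking once), one way is to log the process's peak resident set size from inside the loop using the standard-library `resource` module (Unix only). A minimal sketch; note that `ru_maxrss` is reported in bytes on OS X but in kilobytes on Linux, so the helper below normalizes by platform:

```python
import sys
import resource

def peak_rss_mb():
    """Peak resident set size of the current process, in MB.

    ru_maxrss is in bytes on OS X (darwin) but in kilobytes on Linux.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return rss / 2.0**20 if sys.platform == "darwin" else rss / 2.0**10

# Inside the interpolation loop, log the high-water mark periodically:
# if i % 10 == 0:
#     print(i, peak_rss_mb())
print(peak_rss_mb())
```

A steadily climbing high-water mark across iterations, with no objects surviving the loop body, would point at an allocation below the Python heap (e.g. in a C extension) rather than at the garbage collector.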
I tried this with CloughTocher2DInterpolator as well and received a similar error:
Python(10974) malloc: *** mmap(size=16777216) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
QH6080 qhull error (qh_memalloc): insufficient memory to allocate short memory buffer (65536 bytes)
While executing: | qhull d Qbb Qz Qc Qt
Options selected for Qhull 2012.1 2012/02/18:
run-id 1859014481 delaunay Qbbound-last Qz-infinity-point Qcoplanar-keep
Qtriangulate _pre-merge _zero-centrum Qinterior-keep Pgood _max-width 2
Error-roundoff 2.8e-15 _one-merge 1.9e-14 Visible-distance 5.5e-15
U-coplanar-distance 5.5e-15 Width-outside 1.1e-14 _wide-facet 3.3e-14
Last point added to hull was p3457. Last merge was #4395.
Qhull has finished constructing the hull.
At error exit:
Delaunay triangulation by the convex hull of 3722 points in 3-d:
Number of input sites and at-infinity: 3722
Number of Delaunay regions: 0
Number of non-simplicial Delaunay regions: 3604
Statistics for: | qhull d Qbb Qz Qc Qt
Number of points processed: 3722
Number of hyperplanes created: 14131
Number of facets in hull: 13798
Number of distance tests for qhull: 128936
Number of distance tests for merging: 134142
Number of distance tests for checking: 60878
Number of merged facets: 4395
CPU seconds to compute hull (after input): 0.04895
precision problems (corrected unless 'Q0' or an error)
4395 coplanar horizon facets for new vertices
Segmentation fault
Why is memory not being released? Is qhull leaky?
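Both tracebacks bottom out in `scipy.spatial.qhull`, and since the interpolation points `P` never change between iterations, one workaround that sidesteps the repeated triangulation entirely is to run Delaunay/qhull once and reuse the result: `LinearNDInterpolator` accepts a precomputed `Delaunay` object in place of the raw points. A sketch of that approach, with the row count cut down for brevity (`nrows` would be `2*w**2` in the original):

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

w = 61
t = np.linspace(-1, 1, w)
xx, yy = np.meshgrid(t, t)
xx, yy = xx.ravel(), yy.ravel()                # points for evaluation
P = np.c_[np.sign(xx) * np.abs(xx) ** 0.65,
          np.sign(yy) * np.abs(yy) ** 0.65]    # points for interpolation

tri = Delaunay(P)  # run qhull exactly once, outside the loop

nrows = 10  # reduced from 2 * w**2 for this sketch
x = np.random.randint(0, 255, (nrows, w ** 2)).astype(np.uint8)
xo = np.zeros(x.shape, dtype=np.uint8)
for i in range(x.shape[0]):
    f = LinearNDInterpolator(tri, x[i, :])  # reuses the triangulation
    xo[i, :] = f(xx, yy)
```

Besides avoiding whatever is accumulating inside qhull, this should also be noticeably faster, since the triangulation is the expensive part and only the per-row barycentric interpolation is redone.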