Use FFT in np.correlate/convolve? (Trac #1260) #1858
Comments
@josef-pkt wrote on 2009-10-13 I don't know what xcorr does, but the full convolution in standard matlab (2009b) also takes forever (around or close to an hour). np.correlate(x,x) takes no time since it only calculates one value. np.correlate(x,x,'full') also takes around an hour. Maybe a truncated convolution/correlation would be good.
@pv wrote on 2009-10-15 The remaining question then is whether we want to use FFT also in Numpy for these functions.
@josef-pkt wrote on 2009-10-16 I'm using np.correlate for time series analysis, where the number of observations is more in the range of a few hundred to 10000 (in finance), and also with integers (for bias correction). Is there a disadvantage in using the fft for this? (xcorr in matlab's signal toolbox seems to use fft.)
@pv wrote on 2009-10-16 I don't think there is any significant disadvantage in using FFT. It might be a bit slower for small N than explicit convolution, and might for intermediate N introduce more numerical error (we're talking maybe about 1e-11 or so). IIRC, for large N FFT should be better for both numerical error and performance. I don't believe you can perform FFT convolution with integers using the tools currently in Numpy -- a cast to complex is probably always needed. (An interesting exercise might be to write an explicit divide-and-conquer version of convolution; it's possible to do, since it can be done via FFT :)
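A minimal numpy-only sketch of the FFT route described here, including the float round-trip for integer inputs; the function name and the use of the real FFT are illustrative choices, not an existing NumPy API:

```python
import numpy as np

def fft_correlate_full(a, b):
    # Full cross-correlation via FFT: correlation equals convolution
    # with the second (real) input reversed.
    n = len(a) + len(b) - 1
    fa = np.fft.rfft(a, n)
    fb = np.fft.rfft(b[::-1], n)
    return np.fft.irfft(fa * fb, n)

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.0, 1.0, 0.5])
assert np.allclose(fft_correlate_full(a, b), np.correlate(a, b, 'full'))

# Integer inputs: compute in float, then round the result back.
ai = np.array([1, 2, 3, 4])
int_res = np.rint(fft_correlate_full(ai, ai)).astype(np.int64)
assert np.array_equal(int_res, np.correlate(ai, ai, 'full'))
```

The rounding recovers the exact integer result as long as the true values stay well below 2**53 and the FFT round-off stays under 0.5; for larger values the direct method remains the safe choice.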
@endolith wrote on 2009-11-20
@cournape wrote on 2009-11-25 FFT has several disadvantages: it does not work for object arrays, can be memory hungry, and is slower when only a few values of the correlation are needed. I think we should keep the current implementation based on the naive implementation, and add an FFT-based one. The current function could take an argument to force naive, fft or heuristic (to try to pick the fastest one).
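A sketch of the dispatch proposed here; the keyword name, the 'auto' heuristic, and the size cutoff are all illustrative, not an actual NumPy signature:

```python
import numpy as np

def correlate_with_method(a, v, mode='full', method='auto'):
    # 'direct' keeps the current naive behaviour, 'fft' uses the
    # transform, 'auto' tries to pick the faster one heuristically.
    a, v = np.asarray(a), np.asarray(v)
    if method == 'auto':
        if a.dtype == object or v.dtype == object:
            method = 'direct'  # FFT cannot handle object arrays at all
        elif len(a) * len(v) > 250_000:
            method = 'fft'     # crude cutoff; FFT pays off for long inputs
        else:
            method = 'direct'
    if method == 'direct':
        return np.correlate(a, v, mode)
    if mode != 'full':
        raise NotImplementedError("only mode='full' is sketched here")
    n = len(a) + len(v) - 1
    return np.fft.irfft(np.fft.rfft(a, n) * np.fft.rfft(v[::-1], n), n)
```

SciPy eventually shipped essentially this design: scipy.signal.correlate and scipy.signal.convolve accept method='auto'/'direct'/'fft', with scipy.signal.choose_conv_method implementing the heuristic.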
@endolith wrote on 2009-12-01 Also, in http://projects.scipy.org/scipy/changeset/5968, shouldn't have been removed?
@endolith wrote on 2011-02-18 Replying to [comment:3 pv]: Shouldn't that be optimized in the fft function itself, instead of modifying all the functions that use it? It could be optimized to use the real FFT when appropriate, too.
Replying to [comment:8 cdavid]: This is what I was asking for. Can object arrays be automatically handled by the naive implementation?
Original ticket http://projects.scipy.org/numpy/ticket/1260 on 2009-10-12 by trac user roger, assigned to unknown.
The convolve and correlate functions appear to be much slower than their MATLAB 2009a equivalents. The MATLAB command xcorr(randn(1e6,1)) takes about 0.35 s to execute, while the Python equivalent x = randn(1e6); correlate(x, x) takes more than a minute (then killed). MATLAB is also much faster for arrays of 1e5 elements. fftconvolve in scipy.signal is even slower.
Tested with Python 2.6.3 and numpy 1.3.0 (x86) under x64 Windows 7 and x64 Ubuntu on an i7.