
Memory error of GrangerAnalyzer #123

Closed

dongqunxi opened this issue Apr 10, 2014 · 7 comments

@dongqunxi

Dear all,
When I run a script like this:

>>> sampling_rate = 1000
>>> freq_idx_G
Out[7]: array([40, 41])
>>> G.frequencies.shape[0]
Out[8]: 513
>>> g1 = np.mean(G.causality_xy[:, :, freq_idx_G], -1)

it raises the following MemoryError:

---------------------------------------------------------------------------
MemoryError                               Traceback (most recent call last)
<ipython-input-6-b3dd332ebe13> in <module>()
----> 1 g1 = np.mean(G.causality_xy[:, :, freq_idx_G], -1)

/home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/descriptors.pyc in __get__(self, obj, type)
    138         # Errors in the following line are errors in setting a
    139         # OneTimeProperty
--> 140         val = self.getter(obj)
    141 
    142         setattr(obj, self.name, val)

/home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/analysis/granger.pyc in causality_xy(self)
    202     @desc.setattr_on_read
    203     def causality_xy(self):
--> 204         return self._dict2arr('gc_xy')
    205 
    206     @desc.setattr_on_read

/home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/analysis/granger.pyc in _dict2arr(self, key)
    191         arr = np.empty((self._n_process,
    192                         self._n_process,
--> 193                         self.frequencies.shape[0]))
    194 
    195         arr.fill(np.nan)

MemoryError: 

Can anyone give me some tips? Thanks!

@arokem
Member

arokem commented Apr 10, 2014

Hmm. How many channels do you have in the data? How many time-points?


@dongqunxi
Author

Three channels, with 6000 time points per channel.

Best wishes,
Qunxi Dong


@arokem
Member

arokem commented Apr 11, 2014

I can't replicate this error on my laptop, so it might be that your machine
just doesn't have enough memory. How much RAM are you working with here?
Would you mind sharing those time series with me? Then I can try it with your
actual data; I had to generate some random data to try out the data size.
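
A minimal sketch of that kind of synthetic check, for anyone who wants to reproduce it; the order parameter and the frequency indices are assumptions, since the full original script isn't shown:

import numpy as np
from nitime.timeseries import TimeSeries
from nitime.analysis import GrangerAnalyzer

# Random data at the reported size: 3 channels x 6000 time points,
# with time on the last axis.
data = np.random.randn(3, 6000)
G = GrangerAnalyzer(TimeSeries(data, sampling_rate=1000), order=1)  # order=1 is assumed
g1 = np.mean(G.causality_xy[:, :, [40, 41]], -1)  # runs comfortably in memory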


@dongqunxi
Author

https://dl.dropboxusercontent.com/u/94877880/causality%20challenge%20biomag%202014/BioMag2014-Causality-Challenge.htm
My laptop has 8 GB of memory, which I think should be enough. I got the time
series from the link above; maybe you can try it. Thanks!

Best wishes,
Qunxi Dong


@arokem
Member

arokem commented Apr 14, 2014

I don't know how to read that data. Could you please save the time-series
in question as an array in an .npy file and share that?


@dongqunxi
Author

@arokem, I have found the cause of this error:
if the array has a shape like (6000, 2), it hits the memory error,
but when I transpose it to (2, 6000), it works well. Thanks for your attention and patience.
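
For reference, a back-of-the-envelope sketch of why the (6000, 2) shape blows up: nitime reads the first axis as the channel axis, so the _dict2arr call in the traceback above tries to allocate a (6000, 6000, 513) float64 array.

# Shape (6000, 2): axis 0 is read as channels, so _n_process == 6000 and
# the causality array from the traceback is (6000, 6000, 513) float64.
n_process, n_freqs = 6000, 513
print(n_process * n_process * n_freqs * 8 / 2**30)  # ~137.6 GiB -> MemoryError

# Shape (2, 6000): _n_process == 2, and the same array is only ~16 KiB.
print(2 * 2 * n_freqs * 8 / 2**10)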

@arokem
Member

arokem commented Apr 15, 2014

Yes - that makes sense. Time should always be on the last dimension. Happy to hear that it's now working for you.
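
A minimal sketch of the fix for anyone who finds this later, assuming the data arrives time-major (the array here is illustrative):

import numpy as np
from nitime.timeseries import TimeSeries
from nitime.analysis import GrangerAnalyzer

# Hypothetical time-major array, shape (6000, 3): one column per channel.
data_time_major = np.random.randn(6000, 3)
data = data_time_major.T  # -> (3, 6000): time on the last dimension
G = GrangerAnalyzer(TimeSeries(data, sampling_rate=1000), order=1)  # order=1 assumed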

@arokem arokem closed this as completed Apr 15, 2014