Memory leak even when cache_size = 0 and history_length = 0 or history_length = 1 #3452
Comments
Looking into it now. Anyone else testing, you might want to set nrows down to 1e6 to avoid hitting swap.
It is something related to the output cache.
Ah, the last three outputs are cached as `_`, `__`, and `___`.
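The mechanics of the leak can be sketched in plain Python. Here `Big` and `out_cache` are hypothetical stand-ins for a large result object and IPython's `Out` history dict; the point is that a cached reference alone keeps the object alive:

```python
import weakref

class Big:
    """Hypothetical stand-in for a large result (e.g. a big array)."""
    pass

out_cache = {}            # models IPython's Out history dict
obj = Big()
ref = weakref.ref(obj)    # lets us observe when obj is actually freed

out_cache[1] = obj        # displaying a result stores it in the cache
del obj                   # the user drops their own reference...
assert ref() is not None  # ...but the cache entry still keeps it alive

del out_cache[1]          # only clearing the cache entry frees it
assert ref() is None
```

Dropping the cache entries (and the `_`-style names bound to recent outputs) is what actually releases the memory.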
i'd like it completely disabled, because that's what i'm asking for when i say `cache_size = 0`
Do you want to have a look at
where are the relevant tests?
ah i see them, in `InteractiveShellTestCase`
do you guys use travis or some other CI that I can hook into?
@cpcloud yes, if you make a PR it will place it in our Travis queue
ok cool thanks
is there a way to set config variables at runtime?
```python
In [1]: get_ipython().config
Out[1]:
{'InteractiveShellApp': {'extensions': ['storemagic',
   'memory_profiler',
   'django_notebook']},
 'ProfileDir': {},
 'TerminalInteractiveShell': {'banner1': '', 'colors': 'LightBG'}}
```

and you can set them just by accessing new keys as though they are attributes:

```python
In [2]: get_ipython().config.TerminalInteractiveShell.foo = 1

In [3]: get_ipython().config
Out[3]:
{'InteractiveShellApp': {'extensions': ['storemagic',
   'memory_profiler',
   'django_notebook']},
 'ProfileDir': {},
 'TerminalInteractiveShell': {'banner1': '', 'colors': 'LightBG', 'foo': 1}}
```
There's also a `%config` magic.
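The original example was not preserved; a sketch of the usual form, assuming `%config Class.trait = value` syntax (the class name depends on which shell is running, and bare `%config` lists the configurable classes):

```
In [4]: %config TerminalInteractiveShell.cache_size = 0
```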
@takluyver and @cpcloud, any progress on this one? I am going to bump it to 3.0. If you plan on getting to this in the next few days, feel free to bump it back to 2.0.
I've just stumbled across this issue. Even with the output cache disabled, the Jupyter notebook will not free objects referenced in cells. This is not a problem when working with small objects, but it does get rather annoying in interactive sessions that involve big data on a resource-constrained device (e.g. GPU memory).
It seems that the following works:

```python
import IPython
IPython.get_ipython().displayhook.cache_size = 0
```
@mhsekhavat: that setting is a kernel or standalone interactive console option and does not apply to notebooks, so it's not surprising that it has no effect there.

For @wjakob: that specific setting works only when using the standalone interactive console, not the notebook.
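One approach that does apply inside a notebook kernel is clearing the output history explicitly; `%reset out` (flush the `Out` cache, `-f` skips the confirmation prompt) and `%xdel name` (delete a name and purge IPython-held references to the object) are both standard IPython magics:

```
In [5]: %reset -f out

In [6]: %xdel big
```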
None of the proposed solutions seem to do anything for me. I have created a reproduction notebook: https://colab.research.google.com/drive/1UpqpMbb6fpCZFDXNZ-Q5i72aAqn8R2cI?usp=sharing
i'm using git master ipython, python 2.7.5, arch linux 64-bit

using the following code as a starting point

if you now repeatedly evaluate `big.values` (by hand, not in a loop) then memory keeps on growing (open `htop` or `top` to watch it in action). same is true when i've set the output cache to zero and the history length to 0 or 1. furthermore this doesn't happen in vanilla python, which has just `_` for history, and if i, e.g., evaluate `big.values` n times then i need to execute n other statements, e.g. `x = 1`, n times to reclaim the memory. see issue at pandas-dev/pandas#3629 for a long discussion about this. am i missing some feature/quirk of the history or caching system?
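The starting-point snippet was not preserved. A stdlib-only sketch of the same shape, where `Big` and its `values` property are hypothetical stand-ins for a pandas DataFrame whose `.values` returns a freshly allocated array on every access:

```python
class Big:
    """Hypothetical stand-in for a pandas DataFrame: the .values
    property returns a freshly allocated object on every access."""

    def __init__(self, nrows):
        self._data = list(range(nrows))

    @property
    def values(self):
        return list(self._data)   # new allocation each time

big = Big(1000)

# At an IPython prompt, each evaluation of `big.values` yields a new
# object that the output cache (_, __, ___ and Out[n]) keeps alive,
# so resident memory grows until other outputs displace those entries.
assert big.values is not big.values  # distinct object per access
```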