add to dict fails after 1,000,000 items on py 2.7.5 #63642
Comments
d, i = {}, 0

On Py 2.7.5 (Windows 7, x64, 4 GB RAM) this program slowed down noticeably after passing 1,000,000 adds and never completed or raised an exception.
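(The quoted program is truncated here; a later comment in this thread reproduces what appears to be the same loop. A minimal reconstruction, assuming that loop is the reporter's test:

d, i = {}, 0
while i < 10000000:
    n = i + 1
    d[n] = n
    i += 1
)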
Neither your test case nor the one I wrote triggers this on Python 2.7.5 or 3.3.2 on Linux 64-bit:

try:
    xrange
except NameError:
    xrange = range

d = {}
r = 10000000
ctr = 0
for n in xrange(r):
    d[n] = n
    ctr += 1

assert len(d) == r
Works for me on 64-bit OS X 10.9 running Python 2.7.5.
Just another data point: runs fine on Vista, 32-bit box, Python 2.7.5. Python is consuming about 320 MB when the dict is done building.
It works for me on Linux 64-bit:

$ python
Python 2.7.3 (default, Aug 9 2012, 17:23:57)
[GCC 4.7.1 20120720 (Red Hat 4.7.1-5)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> d, i = {}, 0
>>> while (i < 10000000):
... n = i + 1
... d[n] = n
... i += 1
...
>>> import os
>>> os.system("grep VmRSS /proc/%s/status" % os.getpid())
VmRSS: 637984 kB
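(Grepping /proc only works on Linux; Windows has no /proc to inspect. A rough cross-platform way to take the same RSS measurement, assuming the third-party psutil package is installed; this is not part of the original tests:

import os
import psutil  # third-party: pip install psutil

proc = psutil.Process(os.getpid())
d = {}
for n in range(10000000):
    d[n] = n
# Resident set size of this process, in kB, on Linux and Windows alike.
print("VmRSS: %d kB" % (proc.memory_info().rss // 1024))
)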
0 "On Py 2.7.5 (windows7, x64, 4GB ram) this program slowed down obviously after passing 1,000,000 adds and never completed or raised an exception." What is the exception? How much free memory do you have? If Python has not enough memory, Windows will probably starts to move memory to the disk and the system will becomes slower and slower. |
I followed the suggestion of email responders to use xrange instead of while, and observed that 32-bit Suse Linux got past 44,000,000 adds before exiting with a MemoryError, while 64-bit Windows 7 slowed down markedly after 22,000,000 adds and was unusable after 44,000,000 adds. However, the program did not stop or raise an exception, which is a concern. The dict was about 1.6 GB at that point. My current suspicion is that Windows is not doing a good job of pushing memory already allocated by the process out to the page file as the process continues to request more memory. But in my opinion Python should be able to detect a failed allocation request on Windows and raise an appropriate exception, as it does on Linux.
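(A sketch of the behaviour described for 32-bit Linux, where the allocator eventually fails and Python raises a MemoryError the program can catch. On a 64-bit build with a large page file, the same loop may thrash for a long time before any exception appears, which matches the Windows 7 report. This is an illustration, not the reporter's exact program:

# Deliberately exhausts memory; expect heavy swapping on 64-bit systems.
d = {}
n = 0
try:
    while True:
        d[n] = n
        n += 1
except MemoryError:
    print("MemoryError raised after %d adds" % n)
)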
Works for me: Python 2.7.5, 64-bit, Windows 8.1.
Which failure? You're telling us it doesn't fail, it just becomes slow.
I now believe the slow execution is caused by the performance of the Windows 7 page file, not by a Python bug. Others reported that similar tests worked on Windows 8.1 and various Linux systems, so I request that this Python "bug" be closed or withdrawn.