zipfile: Allow reading duplicate filenames #45317
Allow the open() 'name' parameter to be a ZipInfo object, which makes it possible to open archive members with duplicate filenames. Also allow the read() 'name' parameter to be a ZipInfo object, since read() calls open() directly.

I was sent a zip file which had duplicate names in it, and the only way I could see to extract it using zipfile.py was to apply this patch. The infolist() and namelist() methods do return entries for duplicate filenames, but the open() method takes only a name. This patch also updates the docs for zipfile.py.

The zipfile.py module in Python 2.1 through 2.5 does not have an open() method, but it would be trivial to backport this patch to enhance the read() method instead.

# Test:
# write() optionally warns, but still allows,
# creating duplicate file names:
import zipfile
zf = zipfile.ZipFile('dupzip.zip', 'w')
zf.debug = 1
zf.writestr('dupname', 'Hello')
zf.writestr('dupname', 'World')
zf.close()
# Reads back both entries; prints 'Hello' then 'World':
zfr = zipfile.ZipFile('dupzip.zip', 'r')
for inf in zfr.infolist():
    print repr(zfr.read(inf))
zfr.close()
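For completeness, here is a rough sketch of how the patched open() would be used on the 'dupzip.zip' archive created by the test above. It is not part of the patch itself, just an illustration: as far as I can tell, looking a member up by its name string goes through the archive's name-to-info mapping and so can only ever reach one of the duplicate entries, whereas passing the ZipInfo object selects that exact member.

import zipfile

zfr = zipfile.ZipFile('dupzip.zip', 'r')
# By-name lookup can only reach one of the two 'dupname' members.
print repr(zfr.read('dupname'))
# Passing the ZipInfo object itself selects that exact member, so both
# duplicates can be opened and read individually.
for inf in zfr.infolist():
    f = zfr.open(inf)
    print repr(f.read())
    f.close()
zfr.close()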
In the patch you commented "why is 'filepos' computed next? It's never used."
Updated to the latest revision, and converted the documentation part of the patch to reST. Removed the line that pointlessly computes 'filepos', as requested. (Please excuse my reST, I'm new to it and it's getting late.)
Thanks, reviewed, added tests and committed as r63499.