[DataLoader] Close byte stream explicitly #58938
Conversation
💊 CI failures summary (Dr. CI): as of commit 2ad6a79, ci.pytorch.org reported 1 failed job.
When running `test_datapipe.py`, Python `gc` reports lots of `ResourceWarning`s due to unclosed streams. Besides being annoying, this has two potential problems:
- Performance regression, because `gc` requires additional memory and computation to track references
- Python `gc` runs periodically, so we may encounter a "too many open files" error due to OS limits

To reduce the warnings:
- Explicitly close the byte stream
- Modify `test_datapipe.py` to use context managers

Small fix:
- Reorder imports in `test_datapipe.py`

Further investigation: Can we directly use a context manager in `LoadFileFromDisk` and `ReadFileFromTar` to eliminate this error?
- Probably not. It is feasible only if the pipeline is synchronous and without prefetching. When these two features are enabled, the scope guard of the context manager doesn't work.
- We may need to attach some reference counter to these file byte streams so that each stream can close itself.

[ghstack-poisoned]
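The "explicitly close byte stream" fix can be sketched as follows. This is a minimal illustration, not the actual `torch.utils.data` code: the function name and the `(path, stream)` tuple shape are hypothetical stand-ins for a decoder-style DataPipe.

```python
# Hypothetical sketch of a decoder step that explicitly closes each byte
# stream once it has been consumed, instead of leaving cleanup to gc.
import io
from typing import Iterable, Iterator, Tuple


def decode_and_close(source: Iterable[Tuple[str, io.IOBase]]) -> Iterator[Tuple[str, bytes]]:
    """Yield (path, decoded bytes), closing each stream as soon as it is read."""
    for path, stream in source:
        try:
            data = stream.read()
        finally:
            stream.close()  # explicit close: no ResourceWarning, no fd leak
        yield path, data


streams = [("a.txt", io.BytesIO(b"hello")), ("b.txt", io.BytesIO(b"world"))]
decoded = list(decode_and_close(streams))
print(decoded)            # [('a.txt', b'hello'), ('b.txt', b'world')]
print(all(s.closed for _, s in streams))  # True
```

Closing in a `finally` block ensures the stream is released even if decoding raises, without relying on a `with` block whose scope would not survive prefetching.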
@ejguan has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Should we have a test that asserts that a file handle is closed after it has been processed by the decoder? Otherwise LGTM, thanks @ejguan!
On a side note: we should document somewhere that it is the user's responsibility to close the opened file handles if they don't use the built-in decoder.
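The test the reviewer suggests could take roughly this shape. All names here are hypothetical, not the actual `test_datapipe.py` API; `decode_all` is a toy stand-in for the decoder under test.

```python
# Hypothetical test asserting that the decoder closes file handles it consumes.
import io


def decode_all(streams):
    """Toy stand-in for the decoder: read and explicitly close each stream."""
    out = []
    for s in streams:
        try:
            out.append(s.read())
        finally:
            s.close()
    return out


def test_decoder_closes_handles():
    handles = [io.BytesIO(b"x"), io.BytesIO(b"y")]
    assert decode_all(handles) == [b"x", b"y"]
    # The property under test: every handle is closed after decoding.
    assert all(h.closed for h in handles)


test_decoder_closes_handles()
```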
Summary:
Pull Request resolved: pytorch#58938

When running `test_datapipe.py`, Python `gc` reports lots of `ResourceWarning`s due to unclosed streams. Besides being annoying, this has two potential problems:
- Performance regression, because `gc` requires additional memory and computation to track references
- Python `gc` runs periodically, so we may encounter a "too many open files" error due to OS limits

To reduce the warnings:
- Explicitly close the byte stream
- Modify `test_datapipe.py` to use context managers

Small fix:
- Reorder imports in `test_datapipe.py`

Further investigation: Can we directly use a context manager in `LoadFileFromDisk` and `ReadFileFromTar` to eliminate this error?
- Probably not. It is feasible only if the pipeline is synchronous and without prefetching. When these two features are enabled, the scope guard of the context manager doesn't work.
- We may need to attach some reference counter to these file byte streams so that each stream can close itself.

Test Plan: Imported from OSS

Reviewed By: jbschlosser

Differential Revision: D28689862

Pulled By: ejguan

fbshipit-source-id: bb2a85defb8a4ab5384db902ef6ad062185c2653
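The "reference counter attached to the file byte stream" idea from the description could look roughly like this. This is a hedged sketch under assumed names (`RefCountedStream`, `acquire`, `release` are not part of `torch.utils.data`): the underlying stream closes itself only once every registered consumer has released it, which survives prefetching where a `with` block's scope does not.

```python
# Hypothetical reference-counted stream wrapper: the underlying stream is
# closed eagerly when the last consumer releases it.
import io


class RefCountedStream:
    def __init__(self, stream: io.IOBase):
        self._stream = stream
        self._refs = 0

    def acquire(self) -> io.IOBase:
        """Register one consumer and hand out the shared stream."""
        self._refs += 1
        return self._stream

    def release(self) -> None:
        """Drop one consumer; close the stream once nobody holds it."""
        self._refs -= 1
        if self._refs == 0 and not self._stream.closed:
            self._stream.close()


raw = io.BytesIO(b"payload")
rc = RefCountedStream(raw)
a, b = rc.acquire(), rc.acquire()  # two pipeline stages share the stream
rc.release()
print(raw.closed)  # False: one consumer still holds it
rc.release()
print(raw.closed)  # True: closed when the last consumer released
```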