
Storage from_file method #1821

Merged 7 commits into pytorch:master from storage-from-file on Jun 16, 2017

Conversation

vlasenkov
Contributor

Resolves #884
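For context, #884 asks for memory-mapped file support in Storage. The core behavior the new `from_file(filename, shared, size)` constructor provides — two views of the same file observing the same bytes — can be sketched with the stdlib `mmap` module. This is an illustrative sketch of the mechanism only, not the merged Torch code:

```python
import mmap
import os
import struct
import tempfile

# Emulate from_file's shared mapping: map one file twice and show that a
# write through the first mapping is visible through the second, which is
# what the PR's test checks with torch.FloatStorage.from_file(filename, True, size).
size = 16           # number of float32 elements (small for the sketch)
nbytes = size * 4

fd, filename = tempfile.mkstemp()
os.ftruncate(fd, nbytes)  # grow the file to hold `size` elements

with mmap.mmap(fd, nbytes, access=mmap.ACCESS_WRITE) as m1, \
     open(filename, 'r+b') as f2:
    with mmap.mmap(f2.fileno(), nbytes, access=mmap.ACCESS_WRITE) as m2:
        struct.pack_into('<f', m1, 0, 3.5)          # write via the first mapping
        value = struct.unpack_from('<f', m2, 0)[0]  # read via the second
        print(value)  # 3.5

os.close(fd)
os.remove(filename)
```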

return NULL;
}
#if defined(__APPLE__)
shared_mem = 0;


const char *filename;
Py_ssize_t size = 0;
int shared = 0, shared_mem = 0;
static char *kwlist[] = {"filename", "shared", "size", "shared_mem", NULL};
if (!PyArg_ParseTupleAndKeywords(args, keywds, "s|pnp", kwlist,
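The format string `"s|pnp"` makes `filename` mandatory and the rest optional: `s` is a C string, `|` starts the optional arguments, `p` parses a bool, and `n` a `Py_ssize_t`. In Python terms, the parsed signature is roughly the following — a sketch of the argument spec, not the actual C binding:

```python
def from_file(filename, shared=False, size=0, shared_mem=False):
    # Mirrors the C spec "s|pnp": filename is required; shared, size,
    # and shared_mem are optional keywords with these defaults.
    if not isinstance(filename, str):
        raise TypeError("filename must be a str")
    return filename, bool(shared), int(size), bool(shared_mem)

print(from_file("storage.bin", size=100))  # ('storage.bin', False, 100, False)
```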


def test_from_file(self):
    size = 10000
    filename = 'testPytorchStorageFromFile'
    sys = platform.system()

    def check(filename, share_mem):
        s1 = torch.FloatStorage.from_file(filename, True, size, share_mem)
        t1 = torch.FloatTensor(s1).copy_(torch.randn(size))
        t1_copy = t1.new().resize_as_(t1).copy_(t1)
        if share_mem and sys == 'Linux':
            filename = os.path.join('/dev/shm', filename)

        s2 = torch.FloatStorage.from_file(filename, True, size)
        t2 = torch.FloatTensor(s2)
        t2_copy = t2.new().resize_as_(t2).copy_(t2)

        self.assertEqual(t1_copy, t2_copy, 0)
        self.assertEqual(t1, t2, 0)

        if sys == 'Windows':
            os.remove(filename)

    # check shm_open + open
    if sys != 'Darwin':  # OS X has no /dev/shm
        check(filename, True)
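The `/dev/shm` join in the hunk above reflects how Linux exposes POSIX shared memory: objects created via `shm_open()` appear as regular files under `/dev/shm`. A stdlib sketch of that same fact, using `multiprocessing.shared_memory` — which is unrelated to the PR's implementation but backed by the same mechanism:

```python
import os
import platform
from multiprocessing import shared_memory

# On Linux, shm_open()-backed objects show up as files in /dev/shm; that is
# why the test rebuilds the filename under that directory for the shared case.
shm = shared_memory.SharedMemory(create=True, size=64)
on_linux = platform.system() == 'Linux'
visible = os.path.exists(os.path.join('/dev/shm', shm.name)) if on_linux else None
shm.close()
shm.unlink()
print(visible)  # True on Linux, None elsewhere
```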


@apaszke
Contributor

apaszke commented Jun 16, 2017

Actually, I now think it should use libshm even at this stage. Otherwise the unlinking semantics are going to change later, and subsequent commits will be breaking changes.
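The unlinking semantics in question are the POSIX ones: `unlink()` removes the name, but mappings created before the unlink stay valid until they are closed, and libshm is PyTorch's layer for coordinating who unlinks shared segments. A minimal stdlib sketch of that property, using a regular file on a POSIX-like system (not libshm itself):

```python
import mmap
import os
import tempfile

# Map a file, unlink its name, and show the mapping still works:
# unlink removes the directory entry, not the live mapping.
fd, path = tempfile.mkstemp()
os.write(fd, b'hello world!')
m = mmap.mmap(fd, 12, access=mmap.ACCESS_READ)
os.remove(path)          # the name is gone...
data = bytes(m[:5])      # ...but the mapped bytes are still readable
m.close()
os.close(fd)
print(data)  # b'hello'
```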

@vlasenkov
Contributor Author

So I'm removing shared_mem for now, to preserve at least the ordinary mmap functionality.

@apaszke
Contributor

apaszke commented Jun 16, 2017

Sounds good

@vlasenkov
Contributor Author

Where should the Storage docs be stored? In _torch_docs.py, or in a new _storage_docs.py?

@apaszke
Contributor

apaszke commented Jun 16, 2017

I think you need to create _storage_docs.py. At the moment we only document functions implemented in Python.

@vlasenkov
Contributor Author

@apaszke, I've added the docs. Ready for review/merge.

@apaszke apaszke merged commit 3cecdf8 into pytorch:master Jun 16, 2017
@apaszke
Contributor

apaszke commented Jun 16, 2017

Thank you!

@vlasenkov vlasenkov deleted the storage-from-file branch July 16, 2017 13:30
houseroad added a commit to houseroad/pytorch that referenced this pull request Feb 19, 2019
…2b732d (pytorch#17264)

Summary:
Pull Request resolved: pytorch#17264

Previous import was 822d8df0a2a32233c6022f50a158817a0f19bdc7

Included changes:
- **[4c091e0](onnx/onnx@4c091e0)**: Support defined ONNX_ML in parent cmake files (pytorch#1821) <Lu Fang>
- **[57372f3](onnx/onnx@57372f3)**: Delete OpsetVersionConverter.md which is a duplicate of VersionConverter.md (pytorch#1818) <Prasanth Pulavarthi>
- **[ab1c57e](onnx/onnx@ab1c57e)**: [ONNXIFI]Add extension to be implementable (pytorch#1796) <Rui Zhu>
- **[b92eee8](onnx/onnx@b92eee8)**: Revert "Implement Op Annotation's for ONNX (pytorch#1648)" (pytorch#1812) <Ke Zhang>
- **[61f1e9e](onnx/onnx@61f1e9e)**: Enable ONNX_ML by default (pytorch#1810) <Shinichiro Hamaji>
- **[4f064a1](onnx/onnx@4f064a1)**: fix Greater and Less doc (pytorch#1811) <Guoliang Hua>
- **[0628582](onnx/onnx@0628582)**: Implement Op Annotation's for ONNX (pytorch#1648) <Armen>
- **[ad9d2f7](onnx/onnx@ad9d2f7)**: Versioning doc update for Opset 9 (pytorch#1805) <Vinitra Swamy>
- **[e71e3be](onnx/onnx@e71e3be)**: add dilation case for ConvTranspose op (pytorch#1797) <Randy>

Differential Revision: D14135024

fbshipit-source-id: 9ee7d39a5efea5d9b3c12ac3e6acc32ae83c1d0e
ezyang pushed a commit to ezyang/pytorch that referenced this pull request Feb 19, 2019
…2b732d (pytorch#17264)


Reviewed By: yinghai

Differential Revision: D14135024

fbshipit-source-id: 1e4f9dda89abf48994798d080dd5d58207a6e4b6

Successfully merging this pull request may close these issues.

Support for memory-mapped files
4 participants