[WIP] Enable Skorch+Dask-ML #5748
base: main
@@ -0,0 +1,42 @@
import skorch

from . import pickle
from .serialize import dask_deserialize, dask_serialize


@dask_serialize.register(skorch.NeuralNet)
def serialize_skorch(x, context=None):
    protocol = (context or {}).get("pickle-protocol", None)
    headers = {}
    has_module = hasattr(x, "module_")
    if has_module:
        module = x.__dict__.pop("module_")
        # The class of module_ is typically defined interactively on the client,
        # so its namespace is often __main__. Pickle has problems pickling
        # interactively defined classes when they are set as attributes of
        # another object. By pickling it on its own we are able to serialize
        # successfully.
        frames = [None]
        buffer_callback = lambda f: frames.append(memoryview(f))
        frames[0] = pickle.dumps(x, buffer_callback=buffer_callback, protocol=protocol)
        headers["subframe-split"] = i = len(frames)
        frames.append(None)
        frames[i] = pickle.dumps(
            module, buffer_callback=buffer_callback, protocol=protocol
        )
        x.__dict__["module_"] = module
    else:
        frames = [None]
        buffer_callback = lambda f: frames.append(memoryview(f))
        frames[0] = pickle.dumps(x, buffer_callback=buffer_callback, protocol=protocol)

    return headers, frames


@dask_deserialize.register(skorch.NeuralNet)
def deserialize_skorch(header, frames):
    i = header.get("subframe-split")
    model = pickle.loads(frames[0], buffers=frames[1:i])
    if i is not None:
        module = pickle.loads(frames[i], buffers=frames[i + 1 :])
        model.module_ = module
    return model
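For reference, here is a rough sketch (not part of the diff) of how the split frame layout round-trips when the two functions above are called directly. It assumes `serialize_skorch`/`deserialize_skorch` are in scope and that skorch and torch are installed; `Net` is an illustrative stand-in for a user-defined module class.

```python
import skorch
import torch


class Net(torch.nn.Module):  # illustrative stand-in for a user-defined module
    def __init__(self):
        super().__init__()
        self.dense = torch.nn.Linear(20, 2)

    def forward(self, X, **kwargs):
        return self.dense(X)


net = skorch.NeuralNetClassifier(Net).initialize()

# Frame layout produced by serialize_skorch:
#   frames[0]       pickled estimator with module_ popped off
#   frames[1:i]     out-of-band buffers for the estimator
#   frames[i]       pickled module_
#   frames[i + 1:]  out-of-band buffers for the module
headers, frames = serialize_skorch(net)
assert "subframe-split" in headers

restored = deserialize_skorch(headers, frames)
assert isinstance(restored.module_, Net)
```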
@@ -0,0 +1,42 @@
import pytest

skorch = pytest.importorskip("skorch")
torch = pytest.importorskip("torch")

from distributed import Client
from distributed.protocol import deserialize, serialize


def test_serialize_deserialize_skorch_model():

    client = Client(processes=True, n_workers=1)

    class MyModule(torch.nn.Module):
        def __init__(self, num_units=10):
            super().__init__()
            self.dense0 = torch.nn.Linear(20, num_units)

        def forward(self, X, **kwargs):
            return self.dense0(X)

    net = skorch.NeuralNetClassifier(
        MyModule,
        max_epochs=10,
        iterator_train__shuffle=True,
    )

    def test_serialize_skorch(net):
        net = net.initialize()
        return deserialize(*serialize(net))

    # We test on a different worker to ensure that the errors skorch serialization
    # hits on a process other than the client (due to the lack of __main__ context)
    # are actually resolved.
    # See this issue for context:
    # https://github.com/dask/dask-ml/issues/549#issuecomment-669924762

    deserialized_net = list(client.run(test_serialize_skorch, net).values())[0]
    assert isinstance(deserialized_net.module_, MyModule)
Comment on lines +39 to +40

This was the best test I could come up with for testing on the worker. Please let me know if there is a better way to check serialization in a different process.

I think we would benefit from a roundtrip serialization test like some of the others in this directory (without a cluster) to make sure that is working as expected. I know that doesn't show the error per se, but it will help catch other errors in the future. In terms of testing on a worker, I would take a look at some of the other tests, maybe like this one, and then adapt it to your use case. We shouldn't need to do the serialization manually ourselves, but instead rely on Dask to do that for us and merely check that things work as expected.

    client.close()
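A cluster-free roundtrip test along the lines suggested above might look roughly like this. It is only a sketch: it mirrors the existing test's use of `serialize`/`deserialize` and assumes it would live in the same test module, so the `pytest.importorskip` guards above apply.

```python
def test_serialize_skorch_roundtrip():
    # Roundtrip through distributed's serialization in-process, without a cluster.
    class MyModule(torch.nn.Module):
        def __init__(self, num_units=10):
            super().__init__()
            self.dense0 = torch.nn.Linear(20, num_units)

        def forward(self, X, **kwargs):
            return self.dense0(X)

    net = skorch.NeuralNetClassifier(MyModule, max_epochs=1).initialize()

    header, frames = serialize(net)
    restored = deserialize(header, frames)

    assert isinstance(restored, skorch.NeuralNetClassifier)
    assert isinstance(restored.module_, MyModule)
```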
Curious why `module_` can't be pickled on its own. Is there any more info on the issues encountered by leaving this? Also, any downside to (temporarily) modifying a user-provided object here?
So `module_`'s class is an interactively defined class on the client, so its namespace is often `__main__`, e.g. `__main__.MyModule`. Pickle has problems pickling interactively defined classes when they are set as attributes of another object, as it tries to look up the class in the namespace; see e.g. for a trace. By pickling it on its own we are able to serialize successfully.

The only side effect I can think of is if the class is redefined in the worker's namespace, causing undefined behavior while deserializing on the worker. I doubt that will really happen in real workflows.

FWIW, I have added a test to verify that at least the class is the same after deserialization.
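To illustrate the by-reference behavior described above, here is a standalone sketch (not from the PR). When run as a script or in a notebook, pickle records only the module and class name of `MyModule`, which is why unpickling in a process whose `__main__` lacks that definition fails with an `AttributeError`.

```python
import pickle
import pickletools


class MyModule:  # stand-in for a class defined interactively on the client
    pass


# Dumping succeeds locally; the failure only shows up when loading elsewhere.
payload = pickle.dumps({"module_": MyModule()}, protocol=5)

# The disassembly shows a STACK_GLOBAL reference to '__main__' / 'MyModule'
# rather than the class definition itself.
pickletools.dis(payload)
```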
Is this just an issue with `pickle`? Does `cloudpickle` run into this issue, or does it work ok?
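For what it's worth, `cloudpickle` generally avoids the lookup problem by serializing classes defined in `__main__` (or otherwise unimportable classes) by value rather than by reference. A rough sketch, assuming cloudpickle is installed on both ends:

```python
import pickle

import cloudpickle


class MyModule:  # again a stand-in for an interactively defined class
    pass


# cloudpickle embeds the class definition itself in the payload, so the
# receiving process does not need MyModule in its own __main__ (it does,
# however, need cloudpickle importable to rebuild the class).
payload = cloudpickle.dumps(MyModule())
restored = pickle.loads(payload)
assert type(restored).__name__ == "MyModule"
```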