torch.distributed.rpc package not work well with generator and lambda #42705
Labels
module: rpc
Related to RPC, distributed autograd, RRef, and distributed optimizer
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
I'm using the torch.distributed.rpc package to build a distributed training POC. I've noticed that the rpc package uses pickle for serialization, and pickle does not work with some Python features such as generators and lambdas, which adds an extra limitation. Even worse, if I want to use another Python package that relies on lambdas or generators, I have no way to combine it with the rpc package.
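To illustrate the limitation described above, here is a minimal sketch (plain pickle, no RPC involved) showing that generators and lambdas fail to serialize, along with a common workaround of using module-level named functions and materialized sequences. The function name `add_one` is just an illustrative placeholder:

```python
import pickle

# Generators cannot be pickled: their suspended execution state has
# no serializable representation.
gen = (x * x for x in range(10))
try:
    pickle.dumps(gen)
except TypeError as e:
    print("generator:", e)

# Lambdas fail too: pickle serializes functions by qualified name,
# and a lambda has no importable name.
try:
    pickle.dumps(lambda x: x + 1)
except Exception as e:
    print("lambda:", type(e).__name__)

# Workaround sketch: replace the lambda with a module-level named
# function, and materialize the generator into a list before sending.
def add_one(x):  # hypothetical example function
    return x + 1

payload = pickle.dumps(list(x * x for x in range(10)))
print("list payload bytes:", len(payload))
```

The same workaround applies when passing callables or iterables as RPC arguments: named top-level functions and concrete containers serialize, while anonymous functions and live generators do not.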
So my question is:
Thanks for any suggestions.
cc @pietern @mrshenli @pritamdamania87 @zhaojuanmao @satgera @gqchen @aazzolini @rohan-varma @xush6528 @jjlilley @osalpekar @jiayisuse @agolynski