Adding a SyncManager Queue proxy to a SyncManager dict or Namespace proxy raises an exception #74442
In multiprocessing, attempting to add a Queue proxy to a dict or Namespace proxy (all returned by the same SyncManager) raises an exception indicating that the keyword argument manager_owned=True was passed to AutoProxy(), which does not accept it. In lib/python3.6/multiprocessing/managers.py, RebuildProxy() adds this argument on line 873 ("kwds['manager_owned'] = True") and then calls AutoProxy(), whose argument list (defined on lines 909-910) has no manager_owned parameter, so rebuilding the proxy fails.
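A minimal reproduction along these lines (a sketch based on the report above, not code taken from the issue) fails on affected interpreters, roughly 3.6 through 3.9.6, with a traceback ending in the unexpected-keyword TypeError, while 3.5 and the fixed releases accept it:

```python
from multiprocessing import Manager

if __name__ == "__main__":
    with Manager() as manager:       # SyncManager
        shared = manager.dict()      # dict proxy
        queue = manager.Queue()      # Queue proxy
        # On affected versions this assignment fails: the Queue proxy is
        # pickled, rebuilt inside the manager process via RebuildProxy(),
        # and the extra manager_owned=True keyword reaches AutoProxy(),
        # which does not accept it.
        shared["queue"] = queue
        print(shared["queue"])
```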
I encountered this bug while testing the code in this StackOverflow answer:
The code at the end of the answer runs on Python 3.5, but fails on 3.6 with the "unexpected keyword argument 'manager_owned'" error. If someone knows of a workaround until the PR is accepted, that would be appreciated as well.
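One workaround along these lines has circulated (a sketch, not necessarily identical to the monkeypatch discussed later in the thread; the names `_original_autoproxy` and `_patched_autoproxy` are illustrative): wrap multiprocessing.managers.AutoProxy so it tolerates and drops the manager_owned keyword. The patch has to be in effect in the process where the proxy is rebuilt, so apply it before creating the Manager (and, with the spawn start method, make sure child processes import it too):

```python
import functools
from multiprocessing import managers

_original_autoproxy = managers.AutoProxy

@functools.wraps(_original_autoproxy)
def _patched_autoproxy(token, serializer, manager=None, authkey=None,
                       exposed=None, incref=True, manager_owned=False):
    # Accept and discard manager_owned, which AutoProxy on affected
    # versions does not take.
    return _original_autoproxy(token, serializer, manager=manager,
                               authkey=authkey, exposed=exposed,
                               incref=incref)

# Only patch interpreters whose AutoProxy lacks the parameter
# (fixed releases already accept manager_owned).
if "manager_owned" not in _original_autoproxy.__code__.co_varnames:
    managers.AutoProxy = _patched_autoproxy
```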
The issue is also present in Python 3.7.0b1.
This is still an issue: https://stackoverflow.com/questions/56716470/python-multiprocessing-nested-shared-objects-doesnt-work-with-queue Is there a specific reason why #4819 isn't getting reviewed?
I am getting this error now, too. I'm not sure exactly what triggers it: when I use multiprocessing with a Manager.Queue and pass an object around on the queue, I get this error. Would really like it to "just work".
Is there any reason why it hasn't been merged yet?
Might it be better to just *drop* the AutoProxy object altogether? All that it adds is a delayed call to MakeProxyType(f"AutoProxy[{typeid}]", exposed) (with exposed defaulting to public_methods(instance)), per instance per process. It could be replaced by a direct call to MakeProxyType() when the typeid is registered. The advantage is that it would simplify the codebase: no more need to special-case the BaseProxy.__reduce__ method, the get_methods() method on the Server class could be removed, and so on. Less surface for this class of bugs to happen in the future.
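As an illustration of that suggestion (a hypothetical Counter class and manager, not code from the thread or from the stdlib patch), registration can name an explicit proxy type built once by MakeProxyType() instead of leaving it to AutoProxy to build one lazily:

```python
from multiprocessing.managers import BaseManager, MakeProxyType

class Counter:
    """Hypothetical shared object."""
    def __init__(self):
        self._value = 0
    def increment(self):
        self._value += 1
        return self._value
    def value(self):
        return self._value

# Build the proxy class eagerly; it exposes exactly these methods.
CounterProxy = MakeProxyType("CounterProxy", ("increment", "value"))

class CounterManager(BaseManager):
    pass

CounterManager.register("Counter", Counter, proxytype=CounterProxy)

if __name__ == "__main__":
    with CounterManager() as manager:
        counter = manager.Counter()
        counter.increment()
        print(counter.value())  # 1
```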
I'm going to merge both PRs. |
(The original PR was too stale to merge, so I just merged the combined PR and added a Co-Authored-By header.) Now doing the backports.
Merged and backported to 3.10, 3.9. Let's forget about 3.8 or earlier (Lukasz removed the needs-backport-to-3.8 and -3.7 labels from #60545 on May 4). I should note that landing this was not a great experience:
The one positive experience was Martijn Pieters' long explanation on StackOverflow of what went wrong and why this was the correct fix. (Maybe he elaborated a bit much on the monkeypatch. :-) This treatise gave me the confidence that the fix was correct (enough) and should be merged.

Parting shot: IMO we should not have accepted multiprocessing into the stdlib. It is a very useful module, but very complex, and would have been better off as a third-party module, with a more focused crew of maintainers and a quicker release cycle. (Almost everyone who uses multiprocessing needs to install other packages anyway.)
Oh no! Was there a test for this that I ignored? Thanks for cleaning up.
Just chiming in to say that this is still broken for me on Python 3.9.6 (Win10/64): https://pastebin.com/64F2iKaj It does work on 3.10.0b4, though.
From what I understand, the patch landed on 2021-07-02, 4 days after Python 3.9.6's release date of 2021-06-28, so it wasn't included. It should come in 3.9.7, which is due 2021-08-30 according to PEP 596.