Same symbols created in different processes are not resolved as being equal #21121
Comments
I noticed that if I remove the extra assumptions (i.e. use
Here is an example of a minimal unit test which reveals the bug and indeed shows that the issue is with the pickling.
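As background for why a plain pickle round trip is enough to expose this (my own illustration, not the test from this thread): unpickling normally builds a brand-new object, and a cached/interned class only hands back the same object if unpickling is routed back through its constructor:

```python
import pickle

class Plain:
    """An ordinary class: unpickling creates a fresh instance."""
    def __init__(self, name):
        self.name = name

class Interned:
    """A toy interned class, loosely modelled on SymPy's symbol cache."""
    _cache = {}

    def __new__(cls, name):
        obj = cls._cache.get(name)
        if obj is None:
            obj = super().__new__(cls)
            obj.name = name
            cls._cache[name] = obj
        return obj

    def __getnewargs__(self):
        # Protocol 2+ pickles call __new__ with these args on load,
        # so unpickling goes back through the cache.
        return (self.name,)

p = pickle.loads(pickle.dumps(Plain('x')))
i = pickle.loads(pickle.dumps(Interned('x')))
print(p is Plain('x'))     # False: a fresh object on every unpickle
print(i is Interned('x'))  # True: the cache hands back the same object
```

The same mechanism applies across processes, since multiprocessing ships objects to workers via pickle.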
It seems that multiprocessing somehow finds a global symbol (at least on my system). I can see that by appending
to the code in #21121 (comment).
I'm not sure I've completely figured out how Symbol works. The problem seems to be that the assumptions are re-applied via `__setstate__` after the symbol has already been constructed from its name alone. This seems to fix it (using `__getnewargs_ex__` so the assumptions are passed to `__new__` instead):

```diff
diff --git a/sympy/core/basic.py b/sympy/core/basic.py
index dce161a2b2..a9bf432a9e 100644
--- a/sympy/core/basic.py
+++ b/sympy/core/basic.py
@@ -121,20 +121,6 @@ def __new__(cls, *args):
     def copy(self):
         return self.func(*self.args)
 
-    def __reduce_ex__(self, proto):
-        """ Pickling support."""
-        return type(self), self.__getnewargs__(), self.__getstate__()
-
-    def __getnewargs__(self):
-        return self.args
-
-    def __getstate__(self):
-        return {}
-
-    def __setstate__(self, state):
-        for k, v in state.items():
-            setattr(self, k, v)
-
     def __hash__(self):
         # hash cannot be cached using cache_it because infinite recurrence
         # occurs as hash is needed for setting cache dictionary keys
diff --git a/sympy/core/symbol.py b/sympy/core/symbol.py
index 41b3c10672..56f9b3e6b8 100644
--- a/sympy/core/symbol.py
+++ b/sympy/core/symbol.py
@@ -300,11 +300,8 @@ def __new_stage2__(cls, name, **assumptions):
     __xnew_cached_ = staticmethod(
         cacheit(__new_stage2__))  # symbols are always cached
 
-    def __getnewargs__(self):
-        return (self.name,)
-
-    def __getstate__(self):
-        return {'_assumptions': self._assumptions}
+    def __getnewargs_ex__(self):
+        return ((self.name,), self.assumptions0)
 
     def _hashable_content(self):
         # Note: user-specified assumptions not hashed, just derived ones
```
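A toy sketch (mine, not SymPy's actual code) of why `__getnewargs_ex__` helps here: the assumptions travel as keyword arguments back into `__new__` on unpickling, so the cache lookup sees the same key as the original construction did:

```python
import pickle

class CachedSym:
    """Toy model of a cached symbol; names here are illustrative only."""
    _cache = {}

    def __new__(cls, name, **assumptions):
        key = (name, frozenset(assumptions.items()))
        obj = cls._cache.get(key)
        if obj is None:
            obj = super().__new__(cls)
            obj.name = name
            obj.assumptions = dict(assumptions)
            cls._cache[key] = obj
        return obj

    def __getnewargs_ex__(self):
        # Reconstruct via __new__ with the same args *and* kwargs,
        # so unpickling hits the same cache slot.
        return ((self.name,), self.assumptions)

x = CachedSym('x', positive=True)
y = pickle.loads(pickle.dumps(x))
print(y is x)           # True: the round trip resolves to the cached object
print(y.assumptions)    # {'positive': True}
```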
It seems to me that there is a cache problem. Multiprocessing will modify a global symbol in `sympy.functions.combinatorial.numbers`. This code gives this output on my system.
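The mutation of a global symbol can be reproduced without multiprocessing at all; here is a toy model (my own, with made-up names) of the pre-fix pickling, where `__setstate__` re-applies the assumptions onto whatever cached object the bare name resolved to:

```python
import pickle

class BrokenSym:
    """Toy model of the pre-fix behaviour: state is re-applied after a cache hit."""
    _cache = {}

    def __new__(cls, name, **assumptions):
        key = (name, frozenset(assumptions.items()))
        obj = cls._cache.get(key)
        if obj is None:
            obj = super().__new__(cls)
            obj.name = name
            obj.assumptions = dict(assumptions)
            cls._cache[key] = obj
        return obj

    def __getnewargs__(self):
        return (self.name,)          # assumptions are dropped here ...

    def __getstate__(self):
        return {'assumptions': self.assumptions}

    def __setstate__(self, state):   # ... and re-applied here, onto whatever
        self.__dict__.update(state)  # cached object the bare name resolved to

plain = BrokenSym('x')               # a "global" plain symbol
positive = BrokenSym('x', positive=True)
restored = pickle.loads(pickle.dumps(positive))
print(restored is plain)   # True: unpickling hit the plain symbol's cache slot
print(plain.assumptions)   # {'positive': True}: the global got mutated
```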
Hello,
When I try to create symbols (and by extension expressions) in different processes, SymPy somehow does not detect that the symbols are the same even though they have the same name and assumptions.
As an example, consider the following code snippet and the respective output:
Output:
@oscarbenjamin thinks this may be related to pickling and unpickling the symbol. Working in the same process, creating two symbols with the same name and assumptions returns the exact same object:
I also tried to explicitly pickle and unpickle the symbols using the `dill` library, but this also didn't help. Interestingly, if I obtain two expressions (separately) from different processes, one being the integrand `f` and the other the expected integral `F` (both containing only one free symbol, `x`), SymPy manages to resolve that `simplify(F.diff() - f) == 0` and `simplify(integrate(f) - F) == 0`. Note that I do not pass the symbol `x` with respect to which to differentiate or integrate; if I do, it fails. Unfortunately, I don't have a small enough code snippet readily prepared to exemplify this behaviour.