Thoughts on making the numpy dependency optional? #203
Comments
I think this is entirely possible, as we have a fairly weak dependence on NumPy beyond testing. Feel free to take a crack at it; I can look into removing NumPy this weekend.
Yes, I agree this would be a nice thing to do. From what I can tell, the minor problem points where one can't just import/mock numpy lazily are:
Nice check!

```python
if len(n) > 1000 and has_numpy:
    return _numpy_impl(*args)
else:
    return _python_impl(*args)
```

The library isn't strongly type hinted yet (still plenty of …)
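A minimal sketch of how that size-based dispatch could be wired up without a hard numpy requirement (the `has_numpy` flag and the `argsort_indices` helper here are hypothetical, purely to illustrate the lazy-import pattern; opt_einsum's real internals may differ):

```python
# Hypothetical optional-numpy guard -- not opt_einsum's actual code.
try:
    import numpy as np
    has_numpy = True
except ImportError:
    np = None
    has_numpy = False

def argsort_indices(seq):
    """Return the indices that would sort `seq`."""
    # Dispatch to numpy only for large inputs, mirroring the size check above.
    if has_numpy and len(seq) > 1000:
        return np.argsort(seq).tolist()
    # Pure-Python fallback, so numpy stays optional.
    return sorted(range(len(seq)), key=seq.__getitem__)
```

With this shape, numpy is only touched inside the guarded branch, so importing the module never requires it.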
I actually realized you can implement it with `bisect`:

```python
import bisect

def ssa_to_linear_bis(ssa_path, N=None):
    if N is None:
        N = sum(map(len, ssa_path)) - len(ssa_path) + 1
    ids = list(range(N))
    path = []
    ssa = N
    for scon in ssa_path:
        # Locate each SSA id's current position in the shrinking `ids` list.
        con = sorted([bisect.bisect_left(ids, s) for s in scon])
        for j in reversed(con):
            ids.pop(j)
        ids.append(ssa)
        path.append(con)
        ssa += 1
    return path
```

This is significantly faster than the numpy version throughout the range.
Proposal
Make the numpy dependency optional, if possible.
Why?
Minimizing dependencies is a general goal, as it lets a bigger audience reap the benefits of this library. More specifically, some of us are interested in making opt_einsum a hard dependency for torch, but we would like to keep numpy optional. (If you're curious why torch does not have a hard dependency on numpy, see pytorch/pytorch#60081, tl;dr being the last comment.)
A hard dependency would mean all torch users would get the benefits of opt_einsum right away without thinking too hard/needing to manually install opt_einsum themselves.
Alternatives
We could also have torch vendor in opt_einsum, but that adds complexity/maintenance, and we would like to automatically subscribe to improvements in opt_einsum!