The %who magic command works, but %whos gives an error that has to do with numpy:
/IPython/core/magic.py in magic_whos(self, parameter_s) line 870
AttributeError: 'module' object has no attribute 'ndarray'
This version of PyPy has a trial version of numpy that only has 1-D arrays, not an ndarray type.
There's a troubling implication that we have to allow that numpy may not really be numpy. We check for numpy in quite a few places, I think, because it's such a basic package.
I guess to be friendly to PyPy, we would have to make sure that we never consider numpy present in that context.
I don't really think PyPy should be calling this library 'numpy', at least until it's substantially equivalent to numpy. I know quite a few of the tests break on PyPy because of this as well.
True, it would probably be better if they left it as micronumpy while they are in these extremely early stages in development, and especially if their goal is not 100% API coverage.
In any case, I ack'd the source, and we really don't deal with numpy that much, except in the parallel code, where PyPy support is a long way off (if ever), given that it all starts with pyzmq.
Changing this %whos check is a two-line fix. magic_precision also checks for numpy assuming it's real, and can be changed to a more direct check for the function it calls.
There was a GSoC project to allow Cython to produce pure Python output with ctypes for compiled libraries, which would work on PyPy. It didn't get finished, but hopefully someone will pick it up.
The simplest fix, at least in magic.py, is to set ndarray_type to None if there is an AttributeError. If the user does create a PyPy numpy variable, it will be displayed under 'Everything else'. I have fudged summary information for the micronumpy array type (e.g. there's no size or itemsize attribute), but that's aiming at a moving target.
I agree that it would be cleaner to not even import numpy if the user has not already done so and created an ndarray variable, but I'm not sure that can be done simply and cleanly (without risk of breaking something else).
Yes, I think we will go with the more explicit check for the attribute; it's easy enough.
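A minimal sketch of the explicit attribute check being discussed (the helper name and the fake-module setup are illustrative, not IPython's actual code): look numpy up in sys.modules and use getattr with a default, so a stripped-down module that merely calls itself 'numpy' yields None instead of raising AttributeError.

```python
import sys
import types

def get_ndarray_type():
    """Hypothetical helper: return numpy's ndarray type if a real numpy
    is loaded, else None. getattr guards against PyPy's trial numpy,
    which lacks an ndarray attribute entirely."""
    numpy = sys.modules.get('numpy')
    if numpy is None:
        return None
    return getattr(numpy, 'ndarray', None)

# Simulate PyPy's situation: a module named 'numpy' without ndarray.
sys.modules['numpy'] = types.ModuleType('numpy')
print(get_ndarray_type())  # None -> such variables fall into 'Everything else'
```

With the type set to None, %whos can simply skip the array-specific summary branch instead of crashing.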
Make import checks more explicit in %whos
Since PyPy has a fake numpy, checking for 'numpy' is insufficient
only check if numpy/numeric in use, to prevent unnecessary imports
core/formatters.py also uses numpy: it checks whether 'numpy' is in sys.modules. In PyPy, numpy is part of the builtins, so this test is always True, regardless of whether the user has imported it or not.
The %precision magic command then gets an error:
AttributeError: 'module' object has no attribute 'set_printoptions'
because micronumpy does not have set_printoptions.
Yes, I know. I'm somewhat inclined to leave it that way, because it is a PyPy bug that they deliver a package called numpy that does not provide any of numpy's functionality.
You have a good point. The whos error jumped out right away. I had to dig for the precision one.
There's a raging debate about this:
and this bug is a good data point we should provide. PyPy really should not be calling their library numpy unless it provides the real numpy API. If any of you can jump in and at least give them this info, that would be awesome...
Yes, I've been following that, and I'm pretty sure the name issue has also been brought up on the PyPy mailing lists. I don't think adding "hey, here's another thing you haven't gotten around to yet" to the noise will be particularly informative.
It's also true that the %whos error comes first and is unconditional, so the magic was inaccessible for all PyPy users. The %precision error is already conditional on the presence of numpy in sys.modules, so it only affects PyPy users already using micronumpy, and it also happens to be at the end of the block, so the magic still has the desired effect, even with the traceback.
It's also relevant to note that PyPy absolutely does aim for 100% numpy Python API compatibility, so these errors (and anything seen from IPython itself) just mean they aren't done, which makes sense since they are just getting started.
Much of the big debate is about the fact that numpy really has two APIs, Python and C, and how useful a numpy without the C API can be (in many areas the answer is not at all; in others it doesn't matter). The obvious problem is that in CPython, if you have numpy you get a whole universe of tools that can talk to each other (see Travis' various presentations on numpy as a lingua franca), but if your numpy implementation only provides the Python API, you lose a huge fraction of that universe.
If I had any relevantly useful skills, I would jump right in to the Cython -> Python+ctypes project, so I could get pyzmq on PyPy.
The nuisance with the current PyPy numpy is twofold:
its API is nonstandard, missing functions and attributes like ndarray and set_printoptions
it is a builtin that appears in sys.modules regardless of whether the user has imported it or not. Is there a way of checking whether a builtin has been imported or not (other than looking in globals)?
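One answer to that question, sketched here with a hypothetical helper (not IPython's actual code): rather than trusting sys.modules, decide whether the user is really using numpy by scanning their namespace for array instances, which sidesteps the pre-populated builtin entirely.

```python
def numpy_in_use(user_ns, ndarray_type):
    """Hypothetical check: numpy counts as 'in use' only if the user's
    namespace actually contains instances of the array type, not merely
    because 'numpy' appears in sys.modules (as it always does on PyPy)."""
    if ndarray_type is None:
        return False
    return any(isinstance(v, ndarray_type) for v in user_ns.values())

# Stand-in array type so the sketch runs without numpy installed.
class FakeArray:
    pass

print(numpy_in_use({'a': FakeArray()}, FakeArray))  # True
print(numpy_in_use({'x': 1, 'y': 'text'}, FakeArray))  # False
```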
Simply changing its name back to something like micronumpy is probably enough to avoid these conflicts.
Using an explicit format like
works fine because it does not try to do anything with numpy
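For illustration, a toy sketch of that branching (the function name and structure are hypothetical, not IPython's actual implementation): an explicit printf-style format string is used as-is, so the numpy.set_printoptions call is never reached, which is why the explicit form works even under micronumpy.

```python
def precision(arg, numpy_module=None):
    """Hypothetical sketch of %precision-style branching."""
    if arg.startswith('%'):
        # Explicit format string: returned directly, numpy never touched,
        # so this path is safe even under PyPy's micronumpy.
        return arg
    # Numeric argument: build a float format *and* try to configure
    # numpy's print options -- the step that fails on micronumpy.
    digits = int(arg)
    if numpy_module is not None:
        numpy_module.set_printoptions(precision=digits)
    return '%%.%if' % digits

print(precision('%.3g'))  # '%.3g'
print(precision('4'))     # '%.4f'
```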
In PyPy 1.7, the "NumPy effort in PyPy was renamed numpypy."
@minrk, do you want to go back to the simpler code without the defensive check, or leave it as it is now in case something else masquerading as numpy is found for some reason in sys.modules?
I think the current check is better anyway, might as well leave it.
@hpaulj thanks for the heads up! I think the change will be better for a lot of people until they get closer to API completeness.