grad is working, jacobian is not #64

Closed
brunojacobs opened this issue Nov 5, 2015 · 7 comments

Comments

@brunojacobs

First of all, autograd is an amazing tool! The grad function works great, but I am not able to get the jacobian function to work. I am on Python 3.4.3, numpy 1.10.1, and autograd 1.1.1 (installed via pip).

The call to the jacobian function itself works, but the function it returns does not accept my argument and throws: TypeError: The first input argument needs to be a sequence

The error seems to occur in the concatenate function defined at line 44 in autograd/core.py (although I believe the error itself is raised by numpy).

A minimum example to reproduce this:

import autograd.numpy as np
from autograd import grad
from autograd import jacobian

x = np.array([5, 3], dtype=float)

def cost(x):
    return x[0]**2 / x[1] - np.log(x[1])

grad_cost = grad(cost)
jaco_cost = jacobian(cost)

print(grad_cost(x)) # works correctly!
print(jaco_cost(x)) # error

Another, unrelated question (I am not sure if this is the correct place for it): suppose I write the above cost function in a separate module that imports numpy by itself (i.e. plain numpy, not autograd.numpy). Is it possible to somehow import this function from the module directly, instead of copy-pasting the code? Right now I get the following AttributeError: 'FloatNode' object has no attribute 'log'

The import is as follows:

import autograd.numpy as np
from autograd import grad

#This module contains an 'import numpy as np' line
from my_module import cost

x = np.array([5, 3], dtype=float)
#Function from 'my_module' for which I'd like to calculate the gradient
grad_cost = grad(cost)
grad_cost(x) # error

If I remove the np.log call from my cost function, the error disappears. So I guess I somehow need to override the 'regular' numpy import with autograd's numpy import, if that is possible.

@mattjj mattjj closed this as completed in aeedf19 Nov 5, 2015
@mattjj
Contributor

mattjj commented Nov 5, 2015

As to your question, you need the cost function (and anything it calls that uses numpy code) to use autograd's wrapped numpy. The easiest fix would be to change the numpy import statement in my_module.py to use autograd.numpy.
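For example, my_module.py would then look something like this (a hypothetical sketch, assuming the module contains only the cost function from above):

# my_module.py -- hypothetical contents with the import changed
import autograd.numpy as np  # instead of `import numpy as np`

def cost(x):
    return x[0]**2 / x[1] - np.log(x[1])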

As a hack, your code could munge the numpy module before importing my_module, as in

import autograd.numpy as np
from autograd import grad

import numpy as unwrapped_numpy
unwrapped_numpy.log = np.log
from my_module import cost

The import of my_module has to happen after munging the loaded numpy module in case any numpy functions are bound directly to names in my_module.py, i.e. in case there are lines in my_module.py like from numpy import log. If the code in my_module.py only uses import numpy as np and references numpy functions through that module reference, then the import order doesn't matter.
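To see why the binding matters, here is a hypothetical pair of my_module.py variants (the names are illustrative only):

# Variant A: binds the function object at import time.
from numpy import log

def cost_a(x):
    return log(x)  # uses whatever numpy.log was when my_module was imported,
                   # so my_module must be imported after the munging

# Variant B: looks the name up at call time, through the module object.
import numpy as np

def cost_b(x):
    return np.log(x)  # resolves numpy.log on each call, so patching numpy.log
                      # works regardless of import order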

This hack is not a good long-term recipe, and it might even depend on CPython module loading details. (I didn't test it, but something along these lines should work.)

@mattjj
Contributor

mattjj commented Nov 5, 2015

Actually, I wasn't able to test this fix (the error doesn't happen on my Python 2.7.10 setup), so I'm reopening the issue until I can test it later today. If you notice it's still broken, please paste the full error traceback.

@mattjj mattjj reopened this Nov 5, 2015
@brunojacobs
Author

Thank you for the swift response! I tested the fix, but I don't think it works. (Note that I manually applied the fix for #64 in my installed copy, as I am using pip, but you can see from the error output that the change is there.)

Here is the full error traceback:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-1-d12611e5619b> in <module>()
     14 
     15 print(grad_cost(x)) # works correctly!
---> 16 print(jaco_cost(x)) # error

/Users/bruno/anaconda3/lib/python3.4/site-packages/autograd/core.py in gradfun(*args, **kwargs)
     51         grads = list(map(partial(backward_pass, start_node, tape=tape), end_nodes))
     52         shape = dummy.outshape + getshape(args[argnum])
---> 53         return np.reshape(concatenate(grads), shape) if shape else grads[0]
     54     return gradfun
     55 

/Users/bruno/anaconda3/lib/python3.4/site-packages/autograd/core.py in <lambda>(lst)
     42         return list(np.ravel(val))
     43 
---> 44     concatenate = lambda lst: np.concatenate(map(np.atleast_1d, lst))
     45 
     46     @attach_name_and_doc(fun, argnum, 'Jacobian')

TypeError: The first input argument needs to be a sequence

Regarding my question, I will look into the hack you proposed, thanks!

@mattjj
Contributor

mattjj commented Nov 5, 2015

Oh, the traceback shows the problem! The call to map on line 44 needs a call to list wrapped around it. That's how it is on git master, but apparently the pip release doesn't include that fix.
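(In Python 3, map returns a lazy iterator rather than a list, and np.concatenate expects a sequence, hence the TypeError; in Python 2, map returns a list, which is why the error doesn't reproduce there. Given the fix described above, the corrected line would read:

concatenate = lambda lst: np.concatenate(list(map(np.atleast_1d, lst)))

which materializes the mapped values into a list before concatenating.)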

I'll revert the change I made; it wasn't necessary because the fix is already in master. We'll have to update the pip version.

I'm going to close the issue because the bug doesn't exist on master, and we'll update the pip version at some point soon.

@mattjj mattjj closed this as completed Nov 5, 2015
@brunojacobs
Author

Perfect, it works; sorry for the confusion.

For others who might be interested, I used pip to install the latest master as follows:
pip install git+git://github.com/HIPS/autograd.git

@brunojacobs
Author

I'm not sure whether this is the right place for this follow-up (as the original issue has been closed), but I put some more thought into my question about switching between the stripped-down autograd.numpy and the original numpy. This is what I came up with.

My my_module.py starts as follows:

import config

if not config.AUTOGRAD:
    import numpy as np
    from scipy import special
else:
    import autograd.numpy as np
    from autograd.scipy import special

# rest of my_module remains unchanged.

The config module that is imported contains the following boolean:

# Set autograd to active or not
AUTOGRAD = True

If AUTOGRAD is True, autograd.numpy is loaded; otherwise, regular numpy is loaded.

For my use case, this means I no longer have to copy-paste functions from my_module.py into an external Python file. I can just set AUTOGRAD to True and import the functions to which I want to apply autograd methods. I believe this handles my situation pretty well, as it doesn't impose any restrictions on the functions within my_module.py: if I want to use autograd on any of them, I can now import them directly, provided AUTOGRAD is set to True.

The price I pay is that if AUTOGRAD is set to True, functions in my_module.py that rely on parts of 'regular numpy' that are not in autograd.numpy won't work. But at least I acknowledge this trade-off explicitly by setting AUTOGRAD to True.

Another option would be to move all the autograd-related functions in my_module.py to a separate module, but to me that feels like changing too much of the structure of my code. The config flag achieves the same thing with very limited impact on my current codebase.
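For concreteness, a minimal sketch of the intended workflow under this setup (assuming config.AUTOGRAD is True and my_module.py defines the cost function from earlier):

import autograd.numpy as np
from autograd import grad

# my_module picks up autograd.numpy because config.AUTOGRAD is True
from my_module import cost

x = np.array([5.0, 3.0])
grad_cost = grad(cost)
print(grad_cost(x))  # differentiates my_module's cost without copy-pasting it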

@mattjj
Contributor

mattjj commented Nov 5, 2015

That seems like a reasonable enough solution. Of course, if there are specific things in 'regular numpy' that are not in autograd.numpy, you can let us know by opening an issue!
