
second grad error #165

Open
kernel8liang opened this issue Mar 24, 2017 · 5 comments

@kernel8liang

Running the autograd_tutorial example:

https://github.com/dmlc/minpy/blob/master/examples/tutorials/autograd_tutorial.ipynb

import minpy.numpy as np  # the notebook imports MinPy's NumPy-compatible namespace as np
import matplotlib.pyplot as plt
%matplotlib inline
plt.rcParams['figure.figsize'] = (10.0, 8.0) # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

# foo and its derivatives d_foo, d_2_foo, d_3_foo are defined in earlier
# notebook cells (via minpy.core.grad).
x = np.linspace(-10, 10, 200)
# plt.plot only takes ndarray as input. Explicitly convert MinPy Array into ndarray.
plt.plot(x.asnumpy(), foo(x).asnumpy(),
         x.asnumpy(), d_foo(x).asnumpy(),
         x.asnumpy(), d_2_foo(x).asnumpy(),
         x.asnumpy(), d_3_foo(x).asnumpy())
plt.show()

The error is below:

AttributeError                            Traceback (most recent call last)
<ipython-input-5-317bda464833> in <module>()
      9 plt.plot(x.asnumpy(), foo(x).asnumpy(),
     10          x.asnumpy(), d_foo(x).asnumpy(),
---> 11          x.asnumpy(), d_2_foo(x).asnumpy(),
     12          x.asnumpy(), d_3_foo(x).asnumpy())
     13 plt.show()

AttributeError: 'float' object has no attribute 'asnumpy'

In [2]: mxnet.__version__
Out[2]: '0.9.4'

minpy version 0.33.
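
The traceback shows that d_2_foo(x) comes back as a plain Python float rather than a MinPy Array, so it has no .asnumpy() method. A plotting-side workaround is sketched below (to_np is a hypothetical helper, not MinPy API); note that it only avoids the type error by broadcasting the degenerate scalar across the x grid, so the plotted higher derivatives may still be wrong, per the discussion below.

import numpy

xs = x.asnumpy()

def to_np(v):
    # Hypothetical helper: MinPy Arrays expose .asnumpy(); the buggy higher
    # derivatives come back as plain Python floats, which we coerce and
    # broadcast to the x grid so they can still be plotted.
    if hasattr(v, 'asnumpy'):
        return v.asnumpy()
    return numpy.broadcast_to(numpy.asarray(v, dtype=float), xs.shape)

plt.plot(xs, to_np(foo(x)),
         xs, to_np(d_foo(x)),
         xs, to_np(d_2_foo(x)),
         xs, to_np(d_3_foo(x)))
plt.show()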

@wangg12
Contributor

wangg12 commented Mar 24, 2017

Related to #155. Are there any fixes for this?

@lryta
Member

lryta commented Mar 24, 2017

I will try to put together a temporary solution over the weekend. In the long term, this part will be replaced by MXNet's NDArray subsystem, which we are working on now.
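
For reference, the NDArray-based autograd interface that eventually shipped in MXNet (>= 0.11) looks roughly like the sketch below. It is first-order only and is shown as an illustration of the planned replacement, not of MinPy's API.

from mxnet import autograd, nd

x = nd.array([2.0])
x.attach_grad()            # allocate space for the gradient of x
with autograd.record():    # the tape records operations inside this scope
    y = x ** 3
y.backward()               # reverse pass through the recorded tape
print(x.grad)              # 3 * 2^2 = 12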

@jermainewang
Member

jermainewang commented Mar 24, 2017 via email

@lryta
Member

lryta commented Mar 26, 2017

@jermainewang I found the reason now. Look here: link. The function pushed onto the gradient record is the unwrapped version, which means the tape cannot record the operations inside it. I think this change was made out of performance concerns. Should we fix it, or wait for the autograd runtime? (By the way, does the autograd runtime support higher-order derivatives?)
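
To make the failure mode concrete: higher-order derivatives through a tape require the gradient function itself to be built from wrapped, recordable ops, so an outer tape can trace the inner backward pass. The reference HIPS autograd package behaves this way; a minimal sketch with foo(x) = x ** 3 (an assumed example function, not MinPy code):

import autograd.numpy as np   # wrapped ops, recordable by the tape
from autograd import grad

def foo(x):
    return x ** 3

d_foo = grad(foo)        # first derivative:  3x^2
d_2_foo = grad(d_foo)    # second derivative: 6x -- works because d_foo is
d_3_foo = grad(d_2_foo)  # third derivative:  6     itself built from wrapped ops

print(d_foo(2.0), d_2_foo(2.0), d_3_foo(2.0))  # 12.0 12.0 6.0

If d_foo were instead an unwrapped function operating on raw floats, the outer grad would see no recorded operations, which matches the degenerate float results in this issue.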

@Taco-W
Member

Taco-W commented May 31, 2017

@lryta Do we have any follow-up on this?
