This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Storage type fallback detected:
operator = add_n
input storage types = [default, default, ]
output storage types = [row_sparse, ]
params = {"num_args" : 2, }
context.dev_mask = cpu
The operator with default storage type will be dispatched for execution. You're seeing this warning message because the operator above is unable to process the given ndarrays with specified storage types, context and parameter. Temporary dense ndarrays are generated in order to execute the operator. This does not affect the correctness of the programme. You can set environment variable MXNET_STORAGE_FALLBACK_LOG_VERBOSE to 0 to suppress this warning.
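As the warning message itself suggests, the fallback log can be silenced by setting the environment variable before running the script (a usage sketch; the variable name is taken from the message above):

```shell
# Silence the storage-fallback warning quoted above, as the message suggests.
export MXNET_STORAGE_FALLBACK_LOG_VERBOSE=0
```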
Traceback (most recent call last):
File "sparse_bug.py", line 30, in <module>
print(embedding.embedding.weight.grad().data)
File "/Users/lllausen/anaconda3/lib/python3.6/site-packages/mxnet/ndarray/sparse.py", line 728, in data
return self._data()
File "/Users/lllausen/anaconda3/lib/python3.6/site-packages/mxnet/ndarray/sparse.py", line 266, in _data
self.wait_to_read()
File "/Users/lllausen/anaconda3/lib/python3.6/site-packages/mxnet/ndarray/ndarray.py", line 1720, in wait_to_read
check_call(_LIB.MXNDArrayWaitToRead(self.handle))
File "/Users/lllausen/anaconda3/lib/python3.6/site-packages/mxnet/base.py", line 210, in check_call
raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [19:37:39] src/operator/contrib/../operator_common.h:493: Not implemented: operator = _backward_Embedding
input storage types = [default, default, ]
output storage types = [default, default, ]
params = {"input_dim" : 1000, "output_dim" : 300, "dtype" : float32, "sparse_grad" : True, }
context.dev_mask = cpu
The problem is that when the output grad is not the immediate output of _backward_CachedOp, the storage types of the _backward_CachedOp outputs are inferred as dense storage. This is because _backward_CachedOp didn't register FInferStorage, so it produces dense outputs by default.
In this example, the outputs of _backward_CachedOp are passed to add_n to produce row_sparse grad.
To solve this issue, we need to register FInferStorage for _backward_CachedOp, which performs subgraph storage type inference, and return the stype inference result to the caller.
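A toy sketch of what that buys (plain Python, not MXNet internals — all names and the inference helper are illustrative): without an FInferStorage hook, the cached op's outputs unconditionally default to dense, whereas running inference over the subgraph lets the row_sparse output of _backward_Embedding surface to the caller, so add_n no longer needs a dense fallback.

```python
# Toy model of subgraph storage-type inference; not real MXNet APIs.

def infer_subgraph_stypes(ops, input_stype):
    """Propagate a storage type through a linear chain of ops.

    ops: list of (op_name, rule) pairs, where rule maps the incoming
    storage type to the op's output storage type.
    """
    stype = input_stype
    for _name, rule in ops:
        stype = rule(stype)
    return stype

# Without FInferStorage, the cached op reports dense ("default") outputs
# unconditionally, regardless of what the subgraph would actually produce:
def default_stype(_stype):
    return "default"

# With subgraph inference, _backward_Embedding(sparse_grad=True) reports
# row_sparse, which the caller (and add_n) can then consume directly:
subgraph = [("_backward_Embedding", lambda s: "row_sparse")]

print(infer_subgraph_stypes([("_backward_CachedOp", default_stype)], "default"))
print(infer_subgraph_stypes(subgraph, "default"))
```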
With the MXNet master branch, the code below fails. The combination of the following causes the failure: