I was trying to use the function `get_target_contribs_func` with my model built in Keras. However, I got the following error:
RuntimeError: There is a layer after your target layer but it is not an activation layer, which seems odd...if doing regression, make sure to set the target layer to the last layer
My last layer is a linear layer, generated by `keras.layers.core.Activation("linear")`. The error disappears when I change "linear" to "sigmoid", so I guess "linear" is not supported by DeepLIFT as the last layer. Would you add support for it in the future? Or is something wrong with my model?
When calling `deeplift_model.get_target_contribs_func`, the default value of `target_layer_idx` is -2. If your last layer is a linear layer, you should set it to -1, since that is the layer you want to measure contributions to. (As recommended in the DeepLIFT paper, if your last layer were a sigmoid, contributions should instead be measured with respect to the linearity immediately preceding the final sigmoid transformation.) The error is intended to catch cases where people are not aware that `target_layer_idx` defaults to -2.
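As a minimal sketch of the fix, assuming the `deeplift` package's Keras conversion utilities (module paths and conversion function names vary between versions, and `keras_model` is a placeholder for your trained model), the change amounts to passing `target_layer_idx=-1` explicitly:

```python
# Illustrative sketch only: assumes the deeplift package is installed and
# that your model is a Keras Sequential model; adapt paths to your version.
from deeplift.conversion import keras_conversion as kc

# Convert the trained Keras model to a DeepLIFT model.
deeplift_model = kc.convert_sequential_model(keras_model)

# For a regression model whose final layer is Activation("linear"),
# point target_layer_idx at the last layer (-1) instead of the default
# (-2), which assumes a trailing sigmoid/softmax activation.
contribs_func = deeplift_model.get_target_contribs_func(
    find_scores_layer_idx=0,  # the layer whose inputs receive the scores
    target_layer_idx=-1)
```

With a sigmoid output layer, the default `target_layer_idx=-2` is what you want, since it targets the pre-sigmoid linearity.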