With David, we found the origin of the problem and a temporary solution:
The problem comes from the shape of the predictions over one batch. If the model has only one output, the predictions have shape (n_samples,), i.e. a single dimension: the dimension corresponding to the model output disappears.
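For instance, here is a minimal sketch of this behaviour with a scikit-learn regressor (the model and data are hypothetical, only to show the shape):

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: 8 samples, 4 features, single regression target
X = np.random.rand(8, 4)
y = np.random.rand(8)

reg = LinearRegression().fit(X, y)
# For a single-output regressor, predict returns a 1-D array:
# the output dimension has disappeared.
print(reg.predict(X).shape)  # (8,)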
The temporary solution is thus to expand the dimension of the predictions in the wrapper:
import tensorflow as tf

# Create the wrapper class
class Wrapper():
    # The init method stores the wrapped sklearn model
    def __init__(self, model):
        self.model = model

    # The call method calls the predict method and restores the
    # missing output dimension: (n_samples,) -> (n_samples, 1)
    def __call__(self, inputs):
        pred = self.model.predict(inputs)
        pred = tf.expand_dims(pred, axis=1)
        return pred
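With this wrapper, the predictions regain their output dimension and the wrapped model can be handed to the attribution methods as usual. A quick sanity check (sk_model and X stand for any fitted sklearn regressor and its inputs):

wrapped_model = Wrapper(sk_model)
print(wrapped_model(X).shape)  # (n_samples, 1) instead of (n_samples,)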
However, for me, this is only temporary, as the library should be as user-friendly as possible. Thus, I suggest adding a few lines in the function xplique.commons.callable_operations.predictions_one_hot_callable() to check whether we are in the case of a batch where the output dimension disappeared, and to expand the dimension in that case.
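A possible sketch of such a check (the helper name is mine, and the exact integration point inside xplique may differ; this only illustrates the idea):

import tensorflow as tf

def ensure_output_dim(predictions):
    # Restore the missing output dimension when a single-output model
    # returned a 1-D batch of predictions: (n_samples,) -> (n_samples, 1)
    predictions = tf.convert_to_tensor(predictions)
    if len(predictions.shape) == 1:
        predictions = tf.expand_dims(predictions, axis=-1)
    return predictions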
name: Bug report
about: attributions of sklearn wrapped models
title: "[BUG] - attributions of sklearn wrapped models are incoherent"
labels: "wrapper", "bug", "attributions"
assignees: ''
Describe the bug
When a wrapper is used on a regression model from sklearn, the obtained attributions are not coherent. (The problem may be broader.)
To Reproduce
To ease the debugging, here is a minimal example notebook (there are several lines for the visualization, but you can jump to the end of the notebook to see the different attributions): https://colab.research.google.com/drive/1zvt4I9lVpvzg1oWPUNZoFs39w8MPSz_b?usp=sharing
Expected behavior
The last 4 attribution values should be close to 0, far smaller than the first 4.