
[Bug]: Causal fidelity problem with steps=-1 #150

Closed · Fixed by #152
AntoninPoche opened this issue on Dec 8, 2023 · 0 comments
Assignee: AntoninPoche
Labels: bug (Something isn't working)

AntoninPoche (Collaborator) reported:

Module

Metrics

Current Behavior

When using a causal fidelity metric (either Insertion or Deletion) with steps=-1, the call raises an AttributeError.

Expected Behavior

I expect the method to support the parameter values indicated in the documentation.

To fix it, in fidelity.py, replace line 306:
tf.math.floor(self.nb_features * max_percentage_perturbed)
with:
int(np.floor(self.nb_features * max_percentage_perturbed))
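
For context, here is a minimal sketch of why the cast matters (the values below are illustrative, not taken from the library). Presumably, with steps=-1 the metric reuses max_nb_perturbed as its step count, so the tf.math.floor result ends up as the num argument of np.linspace, which must be a Python integer:

```python
import numpy as np
import tensorflow as tf

# Illustrative values, not taken from the library.
nb_features = 100
max_percentage_perturbed = 1.0

# tf.math.floor returns a float EagerTensor; np.linspace calls
# operator.index() on its `num` argument, and the tensor's __index__
# delegates to a numpy float, which has no __index__ on the versions
# shown in the log above.
num = tf.math.floor(nb_features * max_percentage_perturbed)
try:
    np.linspace(0, 100, num + 1, dtype=np.int32)
except (AttributeError, TypeError) as err:
    print(err)  # e.g. 'numpy.float32' object has no attribute '__index__'

# Casting to a plain Python int, as proposed above, behaves correctly.
num = int(np.floor(nb_features * max_percentage_perturbed))
print(np.linspace(0, 100, num + 1, dtype=np.int32)[:5])  # [0 1 2 3 4]
```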

Version

1.3.1

Environment

No response

Relevant log output

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-8-ebb4e1c3761e> in <cell line: 7>()
      6 i = 0
      7 for explanation_name, explanations in explanations_to_test.items():
----> 8   deletion_score = metric(explanations)
      9   deletion_scores.append((explanation_name, deletion_score))
     10 

5 frames
/usr/local/lib/python3.10/dist-packages/xplique/metrics/base.py in __call__(self, explanations)
    157                  explanations: Union[tf.Tensor, np.array]) -> float:
    158         """Evaluate alias"""
--> 159         return self.evaluate(explanations)

/usr/local/lib/python3.10/dist-packages/xplique/metrics/fidelity.py in evaluate(self, explanations)
    326             better) curve.
    327         """
--> 328         scores_dict = self.detailed_evaluate(explanations)
    329 
    330         # compute auc using trapezoidal rule (the steps are equally distributed)

/usr/local/lib/python3.10/dist-packages/xplique/metrics/fidelity.py in detailed_evaluate(self, explanations)
    379         baselines_flatten = baselines.reshape(self.inputs_flatten.shape)
    380 
--> 381         steps = np.linspace(0, self.max_nb_perturbed, self.steps + 1, dtype=np.int32)
    382 
    383         if self.causal_mode == "deletion":

/usr/local/lib/python3.10/dist-packages/numpy/core/overrides.py in linspace(*args, **kwargs)

/usr/local/lib/python3.10/dist-packages/numpy/core/function_base.py in linspace(start, stop, num, endpoint, retstep, dtype, axis)
    118 
    119     """
--> 120     num = operator.index(num)
    121     if num < 0:
    122         raise ValueError("Number of samples, %s, must be non-negative." % num)

/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/ops.py in __index__(self)
    297 
    298   def __index__(self):
--> 299     return self._numpy().__index__()
    300 
    301   def __bool__(self) -> bool:

AttributeError: 'numpy.float64' object has no attribute '__index__'

To Reproduce

Initialize Insertion or Deletion with steps=-1 and call either evaluate or detailed_evaluate. This can easily be reproduced in the "Metrics: Getting started" tutorial; a sketch follows below.
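
A minimal reproduction sketch, assuming a toy Keras classifier in place of the tutorial's model (the model, inputs, targets, and explanations below are illustrative stand-ins):

```python
import numpy as np
import tensorflow as tf
from xplique.metrics import Deletion

# Toy stand-ins for the tutorial's model, data, and explanations.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
    tf.keras.layers.Dense(10),
])
inputs = np.random.rand(8, 32, 32, 3).astype(np.float32)
targets = tf.one_hot(np.random.randint(0, 10, size=8), 10)
explanations = np.random.rand(8, 32, 32).astype(np.float32)

# steps=-1 is a documented parameter value, but on xplique 1.3.1
# the call below raises the AttributeError shown in the log above.
metric = Deletion(model, inputs, targets, steps=-1)
score = metric(explanations)  # equivalently: metric.evaluate(explanations)
```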

AntoninPoche added the bug label on Dec 8, 2023
AntoninPoche self-assigned this on Dec 12, 2023
AntoninPoche linked pull request #152 on Dec 12, 2023