
[Q] Error when applying TSR on MTS #47

Closed
R-Z78 opened this issue Sep 20, 2023 · 6 comments
R-Z78 commented Sep 20, 2023

Hello again! Thanks for your attention!
I'm encountering some errors during the execution of TSR. I think I have made input data in the correct data type and shape. However, the code isn't running successfully. Here is the code and the error message:

import h5py
import numpy as np
import tensorflow as tf

# Load model (test_x and test_y are loaded elsewhere)
model_to_explain = tf.keras.models.load_model("best_model_weights.h5")

from TSInterpret.InterpretabilityModels.Saliency.TSR import TSR

# int_mod = TSR(model_to_explain, test_x.shape[-1], test_x.shape[-2], method='GRAD', mode='feat')
int_mod = TSR(model_to_explain, test_x.shape[-2], test_x.shape[-1], method='IG', mode='time', device='cuda')

item= test_x[0:1,:,:]
print(item.shape)

label=int(test_y[0])
print(label)

exp=int_mod.explain(item,labels=label,TSR = True)

%matplotlib inline  
int_mod.plot(np.array([test_x[0:1,:,:]]),exp)
Mode in TF Saliency time
(1, 10, 91)
1
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[30], line 18
     15 label=int(test_y[0])
     16 print(label)
---> 18 exp=int_mod.explain(item,labels=label,TSR = True)
     20 get_ipython().run_line_magic('matplotlib', 'inline')
     21 int_mod.plot(np.array([test_x[0:1,:,:]]),exp)

File ~\AppData\Local\anaconda3\envs\TSInterpret\lib\site-packages\TSInterpret\InterpretabilityModels\Saliency\SaliencyMethods_TF.py:107, in Saliency_TF.explain(self, item, labels, TSR)
    105 if self.method == "IG" or self.method == "GRAD" or self.method == "SG":
    106     input = input.reshape(-1, self.NumTimeSteps, self.NumFeatures, 1)
--> 107     attributions = self.Grad.explain(
    108         (input, None), self.model, class_index=labels
    109     )
    110 elif self.method == "DLS" or self.method == "GS":
    111     self.Grad = self.Grad(self.model, input)

File ~\AppData\Local\anaconda3\envs\TSInterpret\lib\site-packages\tf_explain\core\integrated_gradients.py:40, in IntegratedGradients.explain(self, validation_data, model, class_index, n_steps)
     34 images, _ = validation_data
     36 interpolated_images = IntegratedGradients.generate_interpolations(
     37     np.array(images), n_steps
     38 )
---> 40 integrated_gradients = IntegratedGradients.get_integrated_gradients(
     41     interpolated_images, model, class_index, n_steps
     42 )
     44 grayscale_integrated_gradients = transform_to_normalized_grayscale(
     45     tf.abs(integrated_gradients)
     46 ).numpy()
     48 grid = grid_display(grayscale_integrated_gradients)

File ~\AppData\Local\anaconda3\envs\TSInterpret\lib\site-packages\tensorflow\python\util\traceback_utils.py:153, in filter_traceback.<locals>.error_handler(*args, **kwargs)
    151 except Exception as e:
    152   filtered_tb = _process_traceback_frames(e.__traceback__)
--> 153   raise e.with_traceback(filtered_tb) from None
    154 finally:
    155   del filtered_tb

File ~\AppData\Local\Temp\__autograph_generated_file1omsyq8b.py:26, in outer_factory.<locals>.inner_factory.<locals>.tf__get_integrated_gradients(interpolated_images, model, class_index, n_steps)
     24     ag__.converted_call(ag__.ld(tape).watch, (ag__.ld(inputs),), None, fscope)
     25     predictions = ag__.converted_call(ag__.ld(model), (ag__.ld(inputs),), None, fscope)
---> 26     loss = ag__.ld(predictions)[:, ag__.ld(class_index)]
     27 grads = ag__.converted_call(ag__.ld(tape).gradient, (ag__.ld(loss), ag__.ld(inputs)), None, fscope)
     28 grads_per_image = ag__.converted_call(ag__.ld(tf).reshape, (ag__.ld(grads), (-1, ag__.ld(n_steps), *ag__.ld(grads).shape[1:])), None, fscope)

ValueError: in user code:

    File "C:\Users\rz124\AppData\Local\anaconda3\envs\TSInterpret\lib\site-packages\tf_explain\core\integrated_gradients.py", line 71, in get_integrated_gradients  *
        loss = predictions[:, class_index]

    ValueError: slice index 1 of dimension 1 out of bounds. for '{{node strided_slice}} = StridedSlice[Index=DT_INT32, T=DT_FLOAT, begin_mask=1, ellipsis_mask=0, end_mask=1, new_axis_mask=0, shrink_axis_mask=2](sequential/dense/Sigmoid, strided_slice/stack, strided_slice/stack_1, strided_slice/stack_2)' with input shapes: [10,1], [2], [2], [2] and with computed input tensors: input[1] = <0 1>, input[2] = <0 2>, input[3] = <1 1>.

Do you have any insights into what might be causing this issue? Any suggestions on how to resolve it would be greatly appreciated. Thank you in advance!

JHoelli (Contributor) commented Sep 20, 2023

Hi @Ricky-zhang9678,

It seems to me that your model output does not return a softmax.
Can you check the shape of your `model.predict` output?

Just for my understanding: does your input have the shape (1, feat, timestep)?
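A quick way to see why the output shape matters here, sketched with placeholder arrays instead of a real model (the shapes mirror those in the traceback above; no TSInterpret or TensorFlow code is needed for the illustration):

```python
import numpy as np

# Stand-in for model.predict(item): a single-sigmoid head returns shape
# (batch, 1), while tf-explain's IntegratedGradients slices
# predictions[:, class_index], which needs (batch, n_classes) with
# n_classes >= 2 when class_index is 1.
sigmoid_preds = np.random.rand(10, 1)   # what the reported model returns
softmax_preds = np.random.rand(10, 2)   # what TSR / tf-explain expects

print(sigmoid_preds.shape)  # (10, 1): slicing [:, 1] is out of bounds
print(softmax_preds.shape)  # (10, 2): [:, 1] selects the class-1 scores
```

The traceback's `slice index 1 of dimension 1 out of bounds ... with input shapes: [10,1]` is exactly this situation: the model's last axis has size 1, but the explainer asks for index 1.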

R-Z78 (Author) commented Sep 20, 2023

> Hi @Ricky-zhang9678,
>
> It seems to me that your model output does not return a softmax. Can you check the shape of your `model.predict` output?
>
> Just for my understanding: does your input have the shape (1, feat, timestep)?

Yes, you are right: my model output does not return a softmax. I used sigmoid as the activation function in the last layer because I am doing binary classification. Maybe that's the reason for the error.
The input here, i.e., `item`, has the shape (1, 10, 91), where 10 refers to timesteps and 91 to features.
Thanks!

JHoelli (Contributor) commented Sep 21, 2023

Hi @Ricky-zhang9678,

Unfortunately, TSR (like most algorithms included in this library) does not currently work with a sigmoid activation function for binary classification. TSR relies largely on tf-explain, which needs a class index for its gradient calculations (see the issue here).

To make TSR work with a sigmoid output, one possibility would be to write a wrapper for the model that expands the single sigmoid neuron into a softmax layer with two neurons. However, I am not sure about the side effects (and would need time to think about it). Therefore, I would advise you to use two output neurons, either with softmax or sigmoid.
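For intuition, the wrapper idea described above amounts to mapping a single sigmoid probability p onto the two-class distribution [1 - p, p]. A minimal NumPy sketch of that mapping (the function name `sigmoid_to_two_class` is hypothetical, not part of TSInterpret):

```python
import numpy as np

def sigmoid_to_two_class(p):
    """Map sigmoid outputs of shape (batch, 1) to a two-neuron
    softmax-like output [P(class 0), P(class 1)] of shape (batch, 2).
    The two columns sum to 1 by construction, so class_index=1
    selects a valid slice for gradient-based explainers."""
    p = np.asarray(p, dtype=float)
    return np.concatenate([1.0 - p, p], axis=1)

preds = np.array([[0.9], [0.2]])         # sigmoid outputs, shape (2, 1)
two_class = sigmoid_to_two_class(preds)  # shape (2, 2)
print(two_class.shape)
```

Note that applying this mapping after the sigmoid changes the gradients flowing through the new "class 0" column, which is presumably the kind of side effect the comment above is cautious about.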

R-Z78 (Author) commented Sep 22, 2023

> Hi @Ricky-zhang9678,
>
> Unfortunately, TSR (like most algorithms included in this library) does not currently work with a sigmoid activation function for binary classification. TSR relies largely on tf-explain, which needs a class index for its gradient calculations (see the issue here).
>
> To make TSR work with a sigmoid output, one possibility would be to write a wrapper for the model that expands the single sigmoid neuron into a softmax layer with two neurons. However, I am not sure about the side effects (and would need time to think about it). Therefore, I would advise you to use two output neurons, either with softmax or sigmoid.

Thanks for your reply and help! I changed my model output to softmax, and TSInterpret now executes successfully. Below you can find visualizations for features 0, 1, and 2. I'm still unsure about the white line on the plot: could you clarify what it signifies when it is flat versus when it trends upward or downward?
[image attachment: TSR visualizations for features 0, 1, and 2]
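As an aside, the head change described in this comment is mathematically benign: a two-neuron softmax with logits [0, z] produces exactly the same class-1 probability as a sigmoid on z. A pure-NumPy check (no model code, just the activations):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

z = 1.5                                  # a single pre-activation logit
p_sigmoid = sigmoid(z)                   # P(class 1) from a sigmoid head
p_softmax = softmax(np.array([0.0, z]))  # two-neuron softmax, logits [0, z]

print(p_sigmoid)     # P(class 1) from the sigmoid head
print(p_softmax[1])  # identical probability from the softmax head
```

So retraining with a two-neuron softmax head, as done here, keeps the predicted probabilities equivalent while giving tf-explain a valid `class_index` to slice.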

JHoelli (Contributor) commented Sep 22, 2023

Hm, interesting. The white line is supposed to be the original time series. If your original time series looks incorrect: there was a bug in the plot function of an older version of TSInterpret (<= 0.3.4), so you may need to update your library version (>= 0.4.0).

R-Z78 (Author) commented Sep 22, 2023

Thanks! I am going to close the issue. Many thanks for your attention!

@R-Z78 R-Z78 closed this as completed Sep 22, 2023