I am trying to extract the attention weights from the pretrained model (rt_1_x_tf_trained_for_002272480_step). In PyTorch with LLaVA, for instance, there is an option such as model.output_attentions=True. Is there a similar option in this repo?
From what I could find browsing the models subfolder, no such functionality exists. Does anyone know how I could go about extracting the attention matrices for each transformer block?
Moreover, what is the appropriate way to load the model so that I can access its underlying architecture? I am used to Keras, but the SavedModel format prevents the use of Keras functionality that would make this task simpler.
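To make the loading question concrete, here is a minimal sketch of the generic TF 2.x SavedModel workflow and what it does (and does not) expose, using a throwaway module in place of the real checkpoint; for the actual model, the path would simply be the downloaded checkpoint directory:

```python
import tempfile
import tensorflow as tf

# Throwaway module saved to a temp dir purely to demonstrate the API;
# the real export_dir would be the rt_1_x_tf_trained_for_002272480_step
# directory.
class Demo(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 8], tf.float32)])
    def __call__(self, x):
        return {"out": x * 2.0}

export_dir = tempfile.mkdtemp()
mod = Demo()
tf.saved_model.save(mod, export_dir, signatures=mod.__call__)

# tf.saved_model.load returns a restored tf.Module, not a Keras model,
# so Keras introspection (model.layers, model.summary()) is unavailable;
# the callable entry points are only exposed through .signatures.
loaded = tf.saved_model.load(export_dir)
print(list(loaded.signatures.keys()))
sig = loaded.signatures["serving_default"]
print(sig.structured_input_signature)
print(sig.structured_outputs)
```

The structured input/output specs at least recover the tensor names, shapes, and dtypes of each signature, which is the closest SavedModel equivalent to a Keras summary.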
Thank you for any information you can provide!