```python
@gin.configurable
def load_hf_model(model_id):
    """Loads a (local) Hugging Face model from a directory containing a
    pytorch_model.bin file and a config.json file."""
    return TransformerModel(model_id)
    # transformers.AutoModel.from_pretrained(model_id)
```
Comment out `load_explanations`:

```python
# Load the explanations
# self.load_explanations(background_dataset=background_dataset)
```
Integration
This is an overview of how to integrate another desired dataset into TTM.
1. Get TTM
2. Setup env
Get into the TTM directory and run these commands:
Afterwards, you should have all dependencies installed.
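The setup commands above were not preserved in this issue; a minimal sketch, assuming TTM follows the usual conda + `requirements.txt` workflow (environment name and Python version are assumptions):

```shell
# Create and activate an isolated environment (name and version are assumptions)
conda create -n ttm python=3.9 -y
conda activate ttm

# Install the project dependencies from the TTM repository root
pip install -r requirements.txt
```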
3. Add needed files
First, put the config file into the /configs folder as /configs/boolq.gin, and change the global config in global_config.gin:
```
GlobalArgs.config = "./configs/boolq.gin"
```
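For illustration, /configs/boolq.gin might contain bindings along these lines; every key below is hypothetical and must be checked against the gin parameters TTM actually defines:

```
# Hypothetical gin bindings -- verify the parameter names against TTM's code
ExplainBot.model_file_path = "./configs/boolq_model"
ExplainBot.dataset_file_path = "./data/boolq.csv"
ExplainBot.name = "boolq"
```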
Then add the dataset files:
Next, download the model from https://huggingface.co/andi611/distilbert-base-uncased-qa-boolq/tree/main and put it under /configs as ./configs/boolq_model.
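One way to fetch the checkpoint is with `huggingface_hub.snapshot_download`; the helper function name below is ours, and the repo id comes from the link above:

```python
from huggingface_hub import snapshot_download


def download_boolq_model(target_dir="./configs/boolq_model"):
    """Download the BoolQ DistilBERT checkpoint into the configs folder."""
    # snapshot_download fetches every file in the repo
    # (pytorch_model.bin, config.json, tokenizer files, ...)
    return snapshot_download(
        repo_id="andi611/distilbert-base-uncased-qa-boolq",
        local_dir=target_dir,
    )
```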
Adapt the original files: in /explain/logic.py, apply the changes quoted at the top of this issue (the `load_hf_model` function and the commented-out `load_explanations` call).
4. Execution
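With the config in place, the bot can be started the same way as the stock TTM demos; that `flask_app.py` is the entry point is an assumption worth verifying against the repository:

```shell
# Launch the TTM interface; it reads GlobalArgs.config from global_config.gin
python flask_app.py
```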