When I initialize KnowledgeNeurons with model_name = 'gpt2', I get an AttributeError when calling get_refined_neurons(). The following snippet reproduces the error in a Colab notebook:
from knowledge_neurons import (
    KnowledgeNeurons,
    initialize_model_and_tokenizer,
)

# setup model, tokenizer + kn class
MODEL_NAME = "gpt2"  ## setting this to "bert-base-uncased" worked, but not "gpt2"
model, tokenizer = initialize_model_and_tokenizer(MODEL_NAME)
kn = KnowledgeNeurons(model, tokenizer)

TEXT = "Sarah was visiting [MASK], the capital of france"
GROUND_TRUTH = "paris"
BATCH_SIZE = 10
STEPS = 20

ENG_TEXTS = [
"Sarah was visiting [MASK], the capital of france",
"The capital of france is [MASK]",
"[MASK] is the capital of france",
"France's capital [MASK] is a hotspot for romantic vacations",
"The eiffel tower is situated in [MASK]",
"[MASK] is the most populous city in france",
"[MASK], france's capital, is one of the most popular tourist destinations in the world",
]
FRENCH_TEXTS = [
"Sarah visitait [MASK], la capitale de la france",
"La capitale de la france est [MASK]",
"[MASK] est la capitale de la france",
"La capitale de la France [MASK] est un haut lieu des vacances romantiques",
"La tour eiffel est située à [MASK]",
"[MASK] est la ville la plus peuplée de france",
"[MASK], la capitale de la france, est l'une des destinations touristiques les plus prisées au monde",
]
TEXTS = ENG_TEXTS + FRENCH_TEXTS
P = 0.5  # sharing percentage

refined_neurons_eng = kn.get_refined_neurons(
ENG_TEXTS,
GROUND_TRUTH,
p=P,
batch_size=BATCH_SIZE,
steps=STEPS,
)
Given below is the full traceback:
Getting coarse neurons for each prompt...: 0%| | 0/7 [00:00<?, ?it/s]
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-8-1164d97fbee5> in <module>()
4 p=P,
5 batch_size=BATCH_SIZE,
----> 6 steps=STEPS,
7 )
6 frames
/usr/local/lib/python3.7/dist-packages/knowledge_neurons/knowledge_neurons.py in get_refined_neurons(self, prompts, ground_truth, p, batch_size, steps, coarse_adaptive_threshold, coarse_threshold, coarse_percentile, quiet)
340 threshold=coarse_threshold,
341 percentile=coarse_percentile,
--> 342 pbar=False,
343 )
344 )
/usr/local/lib/python3.7/dist-packages/knowledge_neurons/knowledge_neurons.py in get_coarse_neurons(self, prompt, ground_truth, batch_size, steps, threshold, adaptive_threshold, percentile, pbar)
270 """
271 attribution_scores = self.get_scores(
--> 272 prompt, ground_truth, batch_size=batch_size, steps=steps, pbar=pbar
273 )
274 assert sum(e is not None for e in [threshold, adaptive_threshold, percentile]) == 1, f"Provide one and only one of threshold / adaptive_threshold / percentile"
/usr/local/lib/python3.7/dist-packages/knowledge_neurons/knowledge_neurons.py in get_scores(self, prompt, ground_truth, batch_size, steps, pbar)
223 encoded_input = self.tokenizer(prompt, return_tensors="pt").to(self.device)
224 for layer_idx in tqdm(
--> 225 range(self.n_layers()),
226 desc="Getting attribution scores for each layer...",
227 disable=not pbar,
/usr/local/lib/python3.7/dist-packages/knowledge_neurons/knowledge_neurons.py in n_layers(self)
136
137 def n_layers(self):
--> 138 return len(self._get_transformer_layers())
139
140 def intermediate_size(self):
/usr/local/lib/python3.7/dist-packages/knowledge_neurons/knowledge_neurons.py in _get_transformer_layers(self)
69
70 def _get_transformer_layers(self):
---> 71 return get_attributes(self.model, self.transformer_layers_attr)
72
73 def _prepare_inputs(self, prompt, target=None, encoded_input=None):
/usr/local/lib/python3.7/dist-packages/knowledge_neurons/patch.py in get_attributes(x, attributes)
16 """
17 for attr in attributes.split("."):
---> 18 x = getattr(x, attr)
19 return x
20
/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in __getattr__(self, name)
1176 return modules[name]
1177 raise AttributeError("'{}' object has no attribute '{}'".format(
-> 1178 type(self).__name__, name))
1179
1180 def __setattr__(self, name: str, value: Union[Tensor, 'Module']) -> None:
AttributeError: 'GPT2LMHeadModel' object has no attribute 'bert'
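For context on the failure, the get_attributes helper in patch.py (visible in the traceback) simply walks a dotted attribute path with getattr. A minimal, self-contained sketch (using hypothetical dummy objects, not the real models) shows why a path starting with "bert" raises on a GPT2-style model, whose layers live under transformer.h instead:

```python
class Dummy:
    """Stand-in for a model object; purely illustrative."""
    pass

def get_attributes(x, attributes):
    # Walk a dotted path such as "bert.encoder.layer" one attribute at a time,
    # mirroring knowledge_neurons/patch.py.
    for attr in attributes.split("."):
        x = getattr(x, attr)
    return x

# Mimic a GPT2-style layout: layers under transformer.h, no .bert attribute.
gpt2_like = Dummy()
gpt2_like.transformer = Dummy()
gpt2_like.transformer.h = ["layer0", "layer1"]

print(len(get_attributes(gpt2_like, "transformer.h")))  # → 2

try:
    get_attributes(gpt2_like, "bert.encoder.layer")
except AttributeError:
    print("AttributeError, just like the traceback above")
```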
Thank you for trying out this code! You need to pass the argument model_type=model_type(MODEL_NAME) to the KnowledgeNeurons wrapper. This tells the code which HF Transformers class to use. You can see here that it defaults to "bert", which is why the bert example works but "gpt2" does not.
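Concretely, the fixed initialization would look like the sketch below (assuming the model_type helper is importable from knowledge_neurons, as in the repository's README; running it requires the knowledge_neurons package and a model download):

```python
from knowledge_neurons import (
    KnowledgeNeurons,
    initialize_model_and_tokenizer,
    model_type,
)

MODEL_NAME = "gpt2"
model, tokenizer = initialize_model_and_tokenizer(MODEL_NAME)

# Passing model_type makes the wrapper use the GPT2 attribute paths
# (e.g. transformer.h) instead of the default "bert" ones.
kn = KnowledgeNeurons(model, tokenizer, model_type=model_type(MODEL_NAME))
```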