Fix Locate GPT-2 Knowledge tutorial in docs (#174)
gsarti committed Apr 24, 2023
1 parent a4a43e2 commit 530ffb9
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions docs/source/examples/locate_gpt2_knowledge.rst
@@ -46,7 +46,7 @@ saving aggregated outputs to disk.
     )
     for i, ex in data:
         # e.g. "The capital of Spain is"
-        prompt = ex["relation"].format{ex["subject"]}
+        prompt = ex["relation"].format(ex["subject"])
         # e.g. "The capital of Spain is Madrid"
         true_answer = prompt + ex["target_true"]
         # e.g. "The capital of Spain is Paris"
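The bug fixed in this first hunk is plain Python syntax: `str.format` is a method, so its argument belongs in parentheses; curly braces only appear inside the template string as placeholders. A minimal sketch with made-up strings (the real template comes from the dataset's `relation` field):

```python
relation = "The capital of {} is"  # illustrative template, per the diff's comment
subject = "Spain"

# prompt = relation.format{subject}   # SyntaxError: braces cannot call a method
prompt = relation.format(subject)     # correct: parentheses invoke .format()
# prompt is now "The capital of Spain is"
```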
@@ -56,15 +56,15 @@ saving aggregated outputs to disk.
         out = attrib_model.attribute(
             prompt,
             true_answer,
-            attributed_fn="contrast_logits_diff",
+            attributed_fn="contrast_prob_diff",
             contrast_ids=contrast.input_ids,
             contrast_attention_mask=contrast.attention_mask,
-            step_scores=["contrast_logits_diff"],
+            step_scores=["contrast_prob_diff"],
             show_progress=False,
         )
         # Save aggregated attributions to disk
         out = out.aggregate()
-        out.save(f"layer_{l}_ex_{i}.json", overwrite=True)
+        out.save(f"layer_{layer}_ex_{i}.json", overwrite=True)
The following plots visualize per-layer attributions for some examples taken from the dataset, showing how
intermediate layers play a relevant role in recalling factual knowledge, in relation to the last subject token in the
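The second hunk renames the attribution target from `contrast_logits_diff` to `contrast_prob_diff`, i.e. the difference in model probability between the true continuation token and the contrastive one. A minimal numeric sketch of that quantity (plain softmax arithmetic; the function name and signature here are illustrative, not inseq's actual API):

```python
import math

def contrast_prob_diff(logits, true_id, contrast_id):
    """Illustrative sketch: p(true_token) - p(contrast_token)
    after a softmax over a single step's logits. Names and
    signature are assumptions for illustration only."""
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return probs[true_id] - probs[contrast_id]

# Toy 3-token vocabulary: id 0 ~ "Madrid" (true), id 1 ~ "Paris" (contrast)
diff = contrast_prob_diff([2.0, 1.0, 0.0], true_id=0, contrast_id=1)
# diff is positive when the model favors the true token over the contrast
```

Working in probability space rather than raw logits keeps the score bounded in [-1, 1] and comparable across examples, which is presumably why the tutorial standardizes on the probability-difference variant.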
