Explore concepts like Self-Correct, Self-Refine, Self-Improve, Self-Contradict, Self-Play, and Self-Knowledge, alongside o1-like reasoning elevation🍓 and hallucination alleviation🍄.
Official PyTorch implementation of "Roles and Utilization of Attention Heads in Transformer-based Neural Language Models" (ACL 2020)
[ACL 2025] Does Time Have Its Place? Temporal Heads: Where Language Models Recall Time-specific Information
Self-Evolving Python Transformer Research