The full pipeline of creating the UHGEval hallucination dataset
Updated Feb 15, 2024 - Python
Antibodies for LLM hallucinations (combining LLM-as-a-judge, NLI, and reward models)
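The "antibodies" idea above, running several independent detectors (an LLM judge, an NLI entailment model, a reward model) and aggregating their verdicts, can be sketched as a simple score-averaging ensemble. The class, the toy detectors, and the 0.5 threshold below are all illustrative assumptions, not code from that repository; a real system would plug in actual model-backed detectors.

```python
from dataclasses import dataclass
from typing import Callable, List

# Each "antibody" maps (source, claim) -> probability that the claim hallucinates.
Detector = Callable[[str, str], float]


@dataclass
class HallucinationEnsemble:
    detectors: List[Detector]
    threshold: float = 0.5  # illustrative decision boundary, not from the repo

    def score(self, source: str, claim: str) -> float:
        # Average the detectors' hallucination probabilities.
        scores = [d(source, claim) for d in self.detectors]
        return sum(scores) / len(scores)

    def is_hallucination(self, source: str, claim: str) -> bool:
        return self.score(source, claim) >= self.threshold


# Toy stand-ins for real detectors (an NLI model, an LLM judge, etc.).
def unsupported_token_fraction(source: str, claim: str) -> float:
    # Fraction of claim tokens that never appear in the source.
    src = set(source.lower().split())
    toks = claim.lower().split()
    if not toks:
        return 0.0
    return sum(t not in src for t in toks) / len(toks)


def unsupported_bigram_fraction(source: str, claim: str) -> float:
    # Fraction of claim bigrams that never appear in the source.
    src_words = source.lower().split()
    src_bigrams = set(zip(src_words, src_words[1:]))
    clm_words = claim.lower().split()
    clm_bigrams = list(zip(clm_words, clm_words[1:]))
    if not clm_bigrams:
        return 0.0
    return sum(b not in src_bigrams for b in clm_bigrams) / len(clm_bigrams)


ensemble = HallucinationEnsemble(
    [unsupported_token_fraction, unsupported_bigram_fraction]
)
print(ensemble.is_hallucination("the sky is blue", "grass is purple"))  # flagged
```

Averaging is the simplest aggregation; majority voting or learned weights over detector scores are equally valid choices in this scheme.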
Code for PARENTing via Model-Agnostic Reinforcement Learning to Correct Pathological Behaviors in Data-to-Text Generation (Rebuffel, Soulier, Scoutheeten, Gallinari; INLG 2020)
[ICML 2024] Official implementation for "HALC: Object Hallucination Reduction via Adaptive Focal-Contrast Decoding"
The implementation for the EMNLP 2023 paper "Beyond Factuality: A Comprehensive Evaluation of Large Language Models as Knowledge Generators"
A PyTorch implementation of the paper "Thinking Hallucination for Video Captioning".
[ECCV 2024] Official PyTorch implementation of ESREAL
mPLUG-HalOwl: Multimodal Hallucination Evaluation and Mitigating
Code for Controlling Hallucinations at Word Level in Data-to-Text Generation (C. Rebuffel, M. Roberti, L. Soulier, G. Scoutheeten, R. Cancelliere, P. Gallinari)
An Easy-to-use Hallucination Detection Framework for LLMs.
DCR-Consistency: Divide-Conquer-Reasoning for Consistency Evaluation and Improvement of Large Language Models
Codes related to the paper "On hallucinations in tomographic imaging"
Code for ACL 2024 paper "TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space"
Code for the EMNLP 2024 paper "Detecting and Mitigating Contextual Hallucinations in Large Language Models Using Only Attention Maps"
An attack that induces hallucinations in LLMs
[ACL 2024] User-friendly evaluation framework: Eval Suite & Benchmarks: UHGEval, HaluEval, HalluQA, etc.
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models. The first work to correct hallucinations in MLLMs.
Alignment toolkit to safeguard LLMs
Leaderboard Comparing LLM Performance at Producing Hallucinations when Summarizing Short Documents
Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
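The core idea behind DoLa, contrasting the output distribution of the final (mature) layer against an earlier (premature) layer so that tokens whose probability grows across layers are favored, can be sketched in a few lines. This is a simplified illustration under assumed toy logits, not the official implementation, which also includes dynamic premature-layer selection and an adaptive plausibility constraint.

```python
import math


def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]


def contrastive_scores(final_logits, premature_logits):
    # DoLa-style contrast: log-probability of the final layer minus
    # log-probability of an earlier layer, per vocabulary token.
    p_final = softmax(final_logits)
    p_early = softmax(premature_logits)
    return [math.log(f) - math.log(e) for f, e in zip(p_final, p_early)]


# Toy 3-token vocabulary: the factual token (index 0) gains probability
# mass between the early and final layers, so the contrast favors it.
final = [3.0, 1.0, 0.5]
early = [1.0, 1.0, 0.5]
scores = contrastive_scores(final, early)
best = max(range(len(scores)), key=scores.__getitem__)
print(best)  # token whose probability grew most across layers
```

Decoding then samples (or greedily picks) from these contrastive scores instead of the raw final-layer logits.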