Detecting AI Hallucinations Before They Happen
Halgorithem is a custom-designed algorithm for detecting AI hallucinations with little to no AI in the algorithm itself. Halgorithem was designed with speed in mind, so it can flag a hallucinating model quickly.
Halgorithem works by parsing your files and input into a tree, which is then compared against trees built from chunks of your source files. If a claim in the response doesn't match the sources, Halgorithem flags it.
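A minimal sketch of the parse-and-compare idea described above. This is a hypothetical illustration, not Halgorithem's actual API or parser: real trees would come from a proper parse (e.g. spaCy), while here each sentence is naively reduced to a `(subject, rest)` pair so the comparison logic stays visible.

```python
# Hypothetical sketch of tree-based claim checking (NOT Halgorithem's real code).
# Each sentence becomes a tiny stand-in "tree": a (first word, remainder) tuple.

def to_tuples(text):
    """Naive stand-in for a parse tree: one (subject, rest) pair per sentence."""
    tuples = set()
    for sent in text.split("."):
        words = sent.split()
        if len(words) >= 2:
            tuples.add((words[0].lower(), " ".join(words[1:]).lower()))
    return tuples

def flag_hallucinations(response, source_chunks):
    """Flag response claims whose structure is absent from every source chunk."""
    source_tuples = set()
    for chunk in source_chunks:
        source_tuples |= to_tuples(chunk)
    return [t for t in to_tuples(response) if t not in source_tuples]
```

For example, checking `"Paris is in France. Paris is in Spain"` against a source chunk containing only `"Paris is in France."` flags the second claim.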
🔗 Fits Into Any AI Workflow Where Responses Are Generated
Halgorithem can be integrated into Python AI pipelines such as LangGraph, CrewAI, PydanticAI, and Microsoft AutoGen.
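One way such an integration could look, sketched under assumptions: `check` stands in for whatever callable Halgorithem exposes, and `with_hallucination_check` is a hypothetical wrapper name, not part of any of the listed frameworks. The pattern is simply a post-generation guard around any `generate()` step.

```python
# Hypothetical pipeline integration (assumed names, not Halgorithem's API):
# wrap a generation callable so every response is checked before it is returned.

def with_hallucination_check(generate, check, source_chunks):
    """Return a guarded generator that attaches hallucination flags to each response."""
    def guarded(prompt):
        response = generate(prompt)
        flags = check(response, source_chunks)
        return {"response": response, "flags": flags, "ok": not flags}
    return guarded
```

In a LangGraph- or CrewAI-style pipeline, `guarded` would replace the raw LLM call node, so downstream steps can branch on `ok`.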
First Robust Solution to AI Hallucination
Halgorithem aims to be the first truly robust solution to hallucination detection among the available alternatives.
Benchmark results, shown as claims per category out of the total claims checked for each topic:

| Topic | Sources | Supported | Weakly Supported | Contradictions | Hallucinations |
|---|---|---|---|---|---|
| Microsoft / Satya Nadella | 5 Wikipedia pages | 3/4 | 1/4 | 0 | 0 |
| James Webb Space Telescope | 3 Wikipedia pages | 5/6 | 1/6 | 0 | 1* |
| Apple / Tim Cook | 3 Wikipedia pages | 3/3 | 0/3 | 0 | 0 |
| Elon Musk / Twitter | 4 Wikipedia pages | 2/2 | 0/2 | 0 | 0 |
*The JWST $10B cost was not present in the scraped source text, so it was marked UNVERIFIABLE rather than a hallucination.
To run Halgorithem, follow these steps:

1. Create a virtual environment:

   ```shell
   python -m venv venv
   ```

2. Activate the virtual environment:

   ```shell
   source venv/bin/activate
   ```

3. Install the required modules:

   ```shell
   pip install -r requirements.txt
   ```

4. Download the spaCy English model (if it is not installed automatically):

   ```shell
   python -m spacy download en_core_web_sm
   ```

5. Run the benchmark:

   ```shell
   python bench.py
   ```


