Currently, both of our assistants hallucinate in two distinct situations:
- They hallucinate when there are not enough resources to answer the question.
- They hallucinate discussion and PR numbers, and fabricate discussion points supposedly drawn from the GitHub database.
A straightforward fix might be to provide explicit instructions not to hallucinate: if there is no direct information about the question, respond with a statement along the lines of "I couldn't find direct information about this, but here are some pointers for further reading."
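A minimal sketch of what such an instruction could look like, assuming an OpenAI-style chat-messages format; the message structure, variable names, and instruction wording below are all hypothetical, not the assistants' actual code:

```python
# Hypothetical sketch: prepend an explicit anti-hallucination instruction
# to the assistant's system prompt. The prompt format and names here are
# assumptions for illustration only.

NO_HALLUCINATION_INSTRUCTION = (
    "If the retrieved resources do not directly answer the question, do not "
    "invent an answer, and never fabricate discussion or PR numbers. Instead "
    "reply: 'I couldn't find direct information about this, but here are some "
    "pointers for further reading.' followed by the closest related resources."
)

def build_system_prompt(base_prompt: str) -> str:
    """Append the anti-hallucination instruction to an existing system prompt."""
    return base_prompt.rstrip() + "\n\n" + NO_HALLUCINATION_INSTRUCTION

# Example of how the instruction would be wired into a chat request:
messages = [
    {"role": "system",
     "content": build_system_prompt("You answer questions about our GitHub project.")},
    {"role": "user", "content": "What did the discussion conclude?"},
]
```

Whether this fully suppresses made-up discussion and PR numbers would need to be verified against the cases reported above.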
Reported by @smakeig.