Providing explicit guidance on answering questions that there is no information resources about #131

@neuromechanist

Description

Currently, both of our assistants hallucinate in two distinct situations:

  1. They hallucinate when there are not enough resources to answer the question.
  2. They hallucinate discussion and PR numbers, and make up discussion points attributed to the GitHub database.

A straightforward fix might be to provide explicit instructions not to hallucinate: if there is no direct information about the question, the assistant should state that it could not find direct information and offer some pointers for further reading.
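As a rough illustration of the proposed fix, the guardrail could be appended to each assistant's system prompt before the chat begins. This is a minimal sketch only; the function and variable names are hypothetical and not taken from our codebase.

```python
# Hypothetical sketch of the proposed guardrail. The wording and the
# helper below are illustrative assumptions, not the actual implementation.

GUARDRAIL = (
    "If the provided resources do not contain direct information about the "
    "question, do not guess. Say that you could not find direct information "
    "and suggest pointers for further reading. Never invent discussion "
    "numbers, PR numbers, or discussion points."
)

def build_system_prompt(base_prompt: str) -> str:
    """Append the anti-hallucination guardrail to a base system prompt."""
    return f"{base_prompt.strip()}\n\n{GUARDRAIL}"

prompt = build_system_prompt("You are a helpful documentation assistant.")
```

The guardrail could instead be injected per-query (e.g. only when retrieval returns few or low-confidence results), which may reduce prompt length for well-covered questions.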

This was reported by @smakeig.

Metadata

Assignees

No one assigned

    Labels

    P1 (Priority 1: Critical, fix as soon as possible), chat-experience
