Bug: llama-perplexity error using multiple-choice binary data #9316

@fedric95

Description

What happened?

"The multiple choice evaluation has been broken in llama.cpp via commit 6ff1398.

The multiple choice evaluation uses binary data stored in params.prompt. Commit 6ff1398 adds prompt escape character processing, which modifies the binary data and renders it unusable. To preserve whatever utility 6ff1398 might have added, we add a flag indicating if the data stored in params.prompt is binary and, if so, avoid the escape processing." @ikawrakow

@ikawrakow fixed the problem in his llama.cpp fork with the following PR: ikawrakow/ik_llama.cpp#33
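To make the shape of the fix concrete, here is a minimal, self-contained sketch of the idea described in the quote above. It is not the actual llama.cpp or ik_llama.cpp code: the field name prompt_is_binary, the helper functions, and the demo in main are illustrative assumptions; the real change is in the PR linked above.

```cpp
// Sketch only: gate escape processing behind a "prompt is binary" flag so the
// packed binary data used by the multiple-choice evaluation is not rewritten.
#include <cstdio>
#include <string>

struct gpt_params {
    std::string prompt;
    bool        prompt_is_binary = false; // assumed flag: true when prompt holds raw binary data
};

// Stand-in for the escape processing added by commit 6ff1398: rewrites
// sequences such as "\n" into real control characters, in place.
static void process_escapes(std::string & s) {
    std::string out;
    for (size_t i = 0; i < s.size(); ++i) {
        if (s[i] == '\\' && i + 1 < s.size() && s[i + 1] == 'n') {
            out += '\n';
            ++i;
        } else {
            out += s[i];
        }
    }
    s = out;
}

// The gist of the fix: only run escape processing on text prompts; leave
// binary prompts untouched so their byte layout stays valid.
static void maybe_process_escapes(gpt_params & params) {
    if (!params.prompt_is_binary) {
        process_escapes(params.prompt);
    }
}

int main() {
    gpt_params text_params;
    text_params.prompt = "Question:\\nAnswer:";           // literal backslash-n in the source text
    maybe_process_escapes(text_params);                    // becomes a real newline

    gpt_params binary_params;
    binary_params.prompt = std::string("\x02\x00\\n", 4);  // packed bytes, not text
    binary_params.prompt_is_binary = true;
    maybe_process_escapes(binary_params);                   // left untouched

    std::printf("text prompt size: %zu, binary prompt size: %zu\n",
                text_params.prompt.size(), binary_params.prompt.size());
    return 0;
}
```

The point of gating on a flag, rather than removing the escape processing, is that text prompts keep whatever benefit commit 6ff1398 added, while the multiple-choice binary data passes through unmodified.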

Name and Version

I tested the issue with the Docker release of llama.cpp:

ghcr.io/ggerganov/llama.cpp:full-cuda--b1-98a532d

What operating system are you seeing the problem on?

Linux

Relevant log output

No response

Metadata


Labels

bug (Something isn't working), medium severity (Used to report medium severity bugs in llama.cpp, e.g. malfunctioning features but still usable)
