Misc. bug: The test-chat fails with std::runtime_error #11705

@yurivict

Description

Name and Version

18/32 Testing: test-chat  
18/32 Test: test-chat 
Command: "/usr/ports/misc/llama-cpp/work/.build/bin/test-chat"
Directory: .
"test-chat" start time: Feb 06 01:21 PST              
Output:
----------------------------------------------------------
# Reading: models/templates/CohereForAI-c4ai-command-r-plus-tool_use.jinja
Terminating due to uncaught exception 0x3855a2041960 of type std::runtime_error           
<end of output>
Test time =   0.08 sec
----------------------------------------------------------
Test Failed.
"test-chat" end time: Feb 06 01:21 PST
"test-chat" time elapsed: 00:00:00
----------------------------------------------------------

Could you please catch std::runtime_error and print a meaningful error message?
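For illustration, a minimal self-contained sketch of what catching the exception in the test's main() could look like. This is not the actual layout of tests/test-chat.cpp; read_template() is a hypothetical stand-in for the real template-loading code, and the path is simply the one printed in the log above:

    // sketch only: simulate a template read that can fail, and report it readably
    #include <cstdio>
    #include <fstream>
    #include <sstream>
    #include <stdexcept>
    #include <string>

    // hypothetical stand-in for the template read done by test-chat
    static std::string read_template(const std::string & path) {
        std::ifstream f(path);
        if (!f) {
            throw std::runtime_error("failed to open template file: " + path);
        }
        std::ostringstream ss;
        ss << f.rdbuf();
        return ss.str();
    }

    int main() {
        try {
            // path taken from the failing test's output
            read_template("models/templates/CohereForAI-c4ai-command-r-plus-tool_use.jinja");
        } catch (const std::exception & e) {
            // print the exception message instead of aborting with an opaque trace
            fprintf(stderr, "test-chat: fatal: %s\n", e.what());
            return 1;
        }
        return 0;
    }

With something along these lines, a missing or unreadable template file would fail the test with the offending path and message in the output, instead of the bare "Terminating due to uncaught exception" abort shown above.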

Operating systems

BSD

Which llama.cpp modules do you know to be affected?

libllama (core library)

Command line

Problem description & steps to reproduce

See above.

First Bad Commit

Found in version: 4649

Relevant log output
