Name and Version
18/32 Testing: test-chat
18/32 Test: test-chat
Command: "/usr/ports/misc/llama-cpp/work/.build/bin/test-chat"
Directory: .
"test-chat" start time: Feb 06 01:21 PST
Output:
----------------------------------------------------------
# Reading: models/templates/CohereForAI-c4ai-command-r-plus-tool_use.jinja
Terminating due to uncaught exception 0x3855a2041960 of type std::runtime_error
<end of output>
Test time = 0.08 sec
----------------------------------------------------------
Test Failed.
"test-chat" end time: Feb 06 01:21 PST
"test-chat" time elapsed: 00:00:00
----------------------------------------------------------
Could you please catch std::runtime_error and print a meaningful error message?
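A minimal sketch of what this could look like, assuming the test body is wrapped at its top level; run_tests() below is a hypothetical stand-in for the existing test code, not the actual test-chat source:

    // Sketch: report the reason for a std::runtime_error instead of letting
    // the runtime terminate the process with an opaque "uncaught exception".
    #include <cstdio>
    #include <stdexcept>
    #include <string>

    // Hypothetical stand-in for the existing test body; here it only simulates
    // the failing template read seen in the log above.
    static void run_tests() {
        const std::string path = "models/templates/CohereForAI-c4ai-command-r-plus-tool_use.jinja";
        throw std::runtime_error("failed to open " + path);
    }

    int main() {
        try {
            run_tests();
        } catch (const std::runtime_error & e) {
            // Print the message carried by the exception before exiting.
            fprintf(stderr, "test-chat: fatal error: %s\n", e.what());
            return 1;
        }
        return 0;
    }

With something like this, the log would show which file could not be read instead of only "Terminating due to uncaught exception ... of type std::runtime_error".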
Operating systems
BSD
Which llama.cpp modules do you know to be affected?
libllama (core library)
Command line
Problem description & steps to reproduce
See above.
First Bad Commit
Found in version: 4649