
Conversation

LongStoryMedia

Updated to support Llama.cpp tags/b6490

Fixes kv_cache errors

Lays the groundwork for a more future-proof, raw passthrough class that will interface more closely with llama.cpp
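
For illustration, a minimal sketch of what a raw passthrough wrapper over llama.cpp's C API could look like. The class name `LlamaPassthrough` and its layout are assumptions, not code from this PR; the entry points used (`llama_model_load_from_file`, `llama_init_from_model`, `llama_model_free`, `llama_free`) are the current llama.cpp names and should be checked against `llama.h` for tags/b6490:

```cpp
// Hypothetical sketch only: the class name and structure are illustrative,
// not taken from this PR. Entry points are llama.cpp's C API as of recent
// tags; verify against llama.h for the tag actually in use.
#include <stdexcept>
#include <string>

#include "llama.h"

// Thin RAII passthrough: owns a llama_model / llama_context pair and exposes
// the raw handles so callers can use llama.cpp's own API directly.
class LlamaPassthrough {
public:
    explicit LlamaPassthrough(
            const std::string &model_path,
            llama_model_params mparams = llama_model_default_params(),
            llama_context_params cparams = llama_context_default_params()) {
        model_ = llama_model_load_from_file(model_path.c_str(), mparams);
        if (!model_) {
            throw std::runtime_error("failed to load model: " + model_path);
        }
        ctx_ = llama_init_from_model(model_, cparams);
        if (!ctx_) {
            llama_model_free(model_);
            throw std::runtime_error("failed to create llama_context");
        }
    }

    ~LlamaPassthrough() {
        if (ctx_)   llama_free(ctx_);
        if (model_) llama_model_free(model_);
    }

    LlamaPassthrough(const LlamaPassthrough &) = delete;
    LlamaPassthrough &operator=(const LlamaPassthrough &) = delete;

    // Raw handles: the "passthrough" part. Callers run decoding, tokenization,
    // and KV/memory management through llama.cpp's API on these directly.
    llama_model   *model() const { return model_; }
    llama_context *ctx()   const { return ctx_; }

private:
    llama_model   *model_ = nullptr;
    llama_context *ctx_   = nullptr;
};
```

In a setup like this, `llama_backend_init()` is still called once at process startup (and `llama_backend_free()` at shutdown) before constructing the wrapper, and KV cache state is managed through llama.cpp's own calls on the exposed context handle rather than through a higher-level abstraction.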
