https://github.com/guidance-ai/guidance seems to be an easy and efficient way to generate tightly controlled output (e.g. JSON). Is there a way to use it with models provided by Ollama?
llama.cpp supports constraining inference with a grammar (ggerganov/llama.cpp#1773). Guidance is essentially this, but implementing it would probably belong in a separate issue.
A few weeks ago we added `format: json` via the API and the CLI. This lets you require that the output is well-formed JSON, and lets you specify the schema to be used. It seems to cover all aspects of this issue, so I will go ahead and close it now. If you think anything was left out, reopen the issue and we can address it. Thanks for being part of this great community.
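For reference, here is a minimal sketch of what a request using `format: json` looks like against the Ollama HTTP API. It only builds the request body for the `/api/generate` endpoint rather than sending it (so no server is required); the model name `llama2` and the prompt are just illustrative placeholders:

```python
import json
import urllib.request

# Request body for Ollama's /api/generate endpoint.
# "format": "json" asks the server to constrain the model's
# output to well-formed JSON.
payload = {
    "model": "llama2",  # any model you have pulled locally
    "prompt": 'List three primary colors as a JSON array under the key "colors".',
    "format": "json",   # request well-formed JSON output
    "stream": False,    # return a single response object
}

def build_request(url="http://localhost:11434/api/generate"):
    """Build (but do not send) the HTTP request to the local Ollama server."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )

req = build_request()
```

Sending `req` with `urllib.request.urlopen(req)` (or the equivalent `requests.post` call) returns a JSON body whose `response` field should parse cleanly with `json.loads`.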