Support prefill #463
Comments
I like the term "prefill" for this. I think it's a CLI option:

```
llm -m claude-3-opus 'JSON list of US state names' --prefill '["'
```

And a Python argument:

```python
model = llm.get_model("claude-3-opus")
response = model.prompt("JSON list of US state names", prefill='["')
```
Probably needs a … (see Lines 243 to 248 in 9ad9ac6)
Or should I call that …? The image branch is currently using … (see Lines 255 to 261 in eaf50d8)
I'm tempted to switch … I also want a …
Another thought @simonw -- I think there are 4 possibilities for API support: …
So instead of a …
In an interesting twist... some of the OpenAI models apparently support this too! https://twitter.com/HamelHusain/status/1782149471624888512
But... it looks like they are a little bit inconsistent about whether they continue the prompt without the prefill or if they answer with the prefill included: https://twitter.com/HamelHusain/status/1782154898102211053
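One way a wrapper could smooth over that inconsistency: check whether the returned text already starts with the prefill, and prepend it only when it's missing, so callers always get one consistent string. This is a sketch, not anything either library actually ships; `normalize_prefill` is a made-up name.

```python
def normalize_prefill(text: str, prefill: str) -> str:
    """Ensure the completion includes the prefill exactly once.

    Some providers echo the prefill back, others continue after it;
    prepending only when absent gives a consistent result either way.
    (Hypothetical helper, not part of any library's API.)
    """
    if text.startswith(prefill):
        return text
    return prefill + text
```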
This is what we decided to do for OpenAI. I think it is a nice approach actually.
Claude 3 and other models (like Reka) support prefill, where you can construct a chat but set the first tokens of the model's reply. I use that in datasette-query-assistant here: https://github.com/datasette/datasette-query-assistant/blob/a777a80bcb3b42933b2933de895f4f2eb9376e9d/datasette_query_assistant/__init__.py#L52-L62

LLM should offer this at both the CLI level and the Python API level.
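To make the payoff of prefill concrete for structured output: seeding the reply with `'["'` pushes the model to continue a JSON array, and the caller glues the prefill back onto the continuation before parsing. The continuation string below is invented for illustration, standing in for what a model might return.

```python
import json

prefill = '["'
# Stand-in for a model continuation that picks up after the prefill tokens
continuation = 'Alabama", "Alaska", "Arizona"]'

# Reassemble the full reply before parsing it as JSON
states = json.loads(prefill + continuation)
```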