♻️ Simplify setup, prepare for multi-models #156
Conversation
Looks like I was too slow :)
```diff
@@ -6,6 +6,7 @@ import {
 	PUBLIC_DEPRECATED_GOOGLE_ANALYTICS_ID,
 } from "$env/static/public";
 import { addYears } from "date-fns";
+import { inspect } from "node:util";
```
```diff
@@ -2,4 +2,6 @@ export interface Message {
 	from: "user" | "assistant";
 	id: ReturnType<typeof crypto.randomUUID>;
 	content: string;
+	// Only for "assistant" messages
+	model?: string;
 }
```
IMO this should instead be on `Conversation` (and always be defined, i.e. we can migrate the current DB).
Otherwise how do you export the conv data to a model author?
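A minimal sketch of that suggestion: a required `model` on the conversation instead of an optional one per message (the import path and the fields besides `model` and `messages` are assumptions about the existing `Conversation` shape, not code from this PR):

```ts
// Hypothetical shape; import path and extra fields are illustrative.
import type { Message } from "./types/Message";

export interface Conversation {
	id: ReturnType<typeof crypto.randomUUID>;
	title: string;
	messages: Message[];
	// Always defined; existing rows would be backfilled with the
	// current default model in a DB migration.
	model: string;
}
```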
@gary149 wanted to be able to switch models during a conversation.

> Otherwise how do you export the conv data to a model author?

Since we send the whole exchange to the model every time during inference, it makes sense to send the model author the whole conversation up to the last response by their own model.
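A hedged sketch of what that export could look like with the per-message `model` field (the helper is hypothetical, not part of this PR):

```ts
// Hypothetical helper: slice the conversation so a model author only
// receives the exchange up to the last response produced by their model.
function exportForModelAuthor(messages: Message[], model: string): Message[] {
	// Index of the last assistant message attributed to this model.
	const lastIdx = messages.map((m) => m.model).lastIndexOf(model);
	return lastIdx === -1 ? [] : messages.slice(0, lastIdx + 1);
}
```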
```diff
@@ -30,10 +31,14 @@ export async function POST({ request, fetch, locals, params }) {
 	const json = await request.json();
 	const {
 		inputs: newPrompt,
+		model,
```
Or pass a conv id instead of a model here, and get the model from the DB.
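A rough sketch of that alternative, assuming a SvelteKit handler and a MongoDB-backed `collections.conversations` store (the import paths and field names are assumptions, not code from this PR):

```ts
import { error } from "@sveltejs/kit";
import { ObjectId } from "mongodb";
// Hypothetical import path for the app's database handle.
import { collections } from "$lib/server/database";

export async function POST({ request, locals, params }) {
	// Resolve the conversation from the id in the URL instead of trusting
	// a `model` field sent by the client.
	const conv = await collections.conversations.findOne({
		_id: new ObjectId(params.id),
		sessionId: locals.sessionId,
	});

	if (!conv) {
		throw error(404, "Conversation not found");
	}

	// The model stored on the conversation is the single source of truth.
	const model = conv.model;

	const { inputs: newPrompt } = await request.json();
	// ... run inference against `model` as before
}
```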
In preparation for open-sourcing the repo, simplify the config needed:

Incidentally, this refactor helps prepare for multi-model support.

Note: this loses the distinction between `MODEL_NAME` and `MODEL_ID` for the sake of configuration simplicity (and I think that's for the better, but it can be reverted).

(Don't forget to update your `.env.local` to test with the prod model if you want; otherwise just setting `HF_ACCESS_TOKEN` in `.env.local` is enough.)

With default configuration:
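A minimal `.env.local` along those lines might then contain only the token (the value shown is a placeholder):

```
HF_ACCESS_TOKEN=hf_xxxxxxxxxxxxxxxxxxxx
```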