[BitBuilder] Add session ID to llm request metadata #37
Conversation
/bb
```
@@ -103,5 +104,4 @@ async def start_mongo_client():
@app.on_event("shutdown")
async def shutdown_db_client():
    app.mongo_client.close()
async def shutdown_db_client():
```
add `app.mongo_client.close()` below here
Sorry, BitBuilder encountered an error while addressing comments in this Pull Request. Please try again later. (wflow_hTrxhBkVLD7U4m7d) 🤖
/bb
```python
class LLMChatCompletionRequest(BaseModel):
    model: str
    messages: List[LLMChatCompletionMessage]
    sessionId: str
```
delete redundant class
Summary:
Issue: #36
Implementation:
- In the `/ui/pages/index.tsx` file, modify the `handleNewUserPrompt` function to include the `sessionId` in the `sendLLMRequest` function call. The `sessionId` is already being retrieved at the start of the `PromptPage` function, so it can be passed directly to the `sendLLMRequest` function. The modified `sendLLMRequest` call should look like this: `const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results), sessionId: sessionId })`
- In the `/ui/shared/api.ts` file, modify the `sendLLMRequest` function to accept `sessionId` as a parameter. The modified function should look like this: `export async function sendLLMRequest(data: LLMChatCompletionRequest, sessionId: string): Promise<string> {...}`. Also, include `sessionId` in the request payload: ``const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/${sessionId}`, data);``
- In the `/backend/models.py` file, modify the `LLMChatCompletionRequest` model to include `sessionId` as a field. The modified model should look like this: `class LLMChatCompletionRequest(BaseModel): model: str; messages: List[LLMChatCompletionMessage]; sessionId: str`
- In the `/backend/main.py` file, modify the `llm` endpoint to accept `sessionId` as a parameter. The modified endpoint should look like this: `@app.post('/llm/{sessionId}')`. Also, modify the `llm` function to pass `sessionId` to the `llm_get` function: `result = await llm_get(request.model, request.messages, sessionId)`
- In the `/backend/llm.py` file, modify the `llm_get` function to accept `sessionId` as a parameter. The modified function should look like this: `async def llm_get(model: str, messages: List[LLMChatCompletionMessage], sessionId: str) -> str`. Also, include `sessionId` in the metadata of the `acompletion` call: `metadata={"environment": getEnvironment(), "sessionId": sessionId}`
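The metadata dict from that last step can be built by a small helper. `getEnvironment` is reconstructed here as a hypothetical environment-variable lookup, since its real definition is not shown in this plan:

```python
import os
from typing import Dict

def getEnvironment() -> str:
    # hypothetical reconstruction: the real helper's logic is not shown in the PR
    return os.environ.get("APP_ENV", "development")

def build_llm_metadata(sessionId: str) -> Dict[str, str]:
    # metadata attached to the completion call, now tagged with the session ID
    return {"environment": getEnvironment(), "sessionId": sessionId}
```

`llm_get` would then pass this dict as the `metadata` argument of `acompletion`, making each completion attributable to the UI session that triggered it.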
Plan Feedback: Approved by @nsbradford
Something look wrong?: If this Pull Request doesn't contain the expected changes, add more information to #36. Then, add the `bitbuilder:create` label to try again. For more information, check the documentation.

Generated with ❤️ by www.bitbuilder.ai