
[BitBuilder] Add session ID to llm request metadata #37

Closed
wants to merge 1 commit into from

Conversation


@ellipsis-dev ellipsis-dev bot commented Sep 26, 2023

Summary:

Issue: #36

Implementation:

  1. Add sessionId to the frontend
    • In the /ui/pages/index.tsx file, modify the handleNewUserPrompt function to include the sessionId in the sendLLMRequest function call. The sessionId is already being retrieved at the start of the PromptPage function, so it can be passed directly to the sendLLMRequest function. The modified sendLLMRequest call should look like this: const llmSummary = await sendLLMRequest({ model: 'gpt-3.5-turbo', messages: buildSummarizationPrompt(content, serverResponseMsg.results), sessionId: sessionId })
  2. Modify sendLLMRequest function to accept sessionId
    • In the /ui/shared/api.ts file, modify the sendLLMRequest function to accept sessionId as a parameter. The modified function should look like this: export async function sendLLMRequest(data: LLMChatCompletionRequest, sessionId: string): Promise<string> {...}. Also, include sessionId in the request URL: const response = await axios.post<{text: string}>(`${backendRootUrl}/llm/${sessionId}`, data);
  3. Modify LLMChatCompletionRequest model to include sessionId
    • In the /backend/models.py file, modify the LLMChatCompletionRequest model to include sessionId as a field. The modified model should look like this: class LLMChatCompletionRequest(BaseModel): model: str; messages: List[LLMChatCompletionMessage]; sessionId: str
  4. Modify llm endpoint to accept sessionId
    • In the /backend/main.py file, modify the llm endpoint to accept sessionId as a parameter. The modified endpoint should look like this: @app.post('/llm/{sessionId}'). Also, modify the llm function to pass sessionId to the llm_get function: result = await llm_get(request.model, request.messages, sessionId)
  5. Modify llm_get function to accept sessionId
    • In the /backend/llm.py file, modify the llm_get function to accept sessionId as a parameter. The modified function should look like this: async def llm_get(model: str, messages: List[LLMChatCompletionMessage], sessionId: str) -> str. Also, include sessionId in the metadata of the acompletion call: metadata={"environment": getEnvironment(), "sessionId": sessionId}
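Taken together, the backend-facing steps can be sketched in plain Python. The helper names below (llm_request_url, build_llm_metadata) are illustrative only and do not appear in the PR; they just show how the session ID flows from the request URL into the LLM call's metadata:

```python
from typing import Dict


def llm_request_url(backend_root_url: str, session_id: str) -> str:
    # Step 2: the frontend posts to {backendRootUrl}/llm/{sessionId}.
    return f"{backend_root_url}/llm/{session_id}"


def build_llm_metadata(environment: str, session_id: str) -> Dict[str, str]:
    # Step 5: the metadata passed to the acompletion call now carries
    # the session ID alongside the environment.
    return {"environment": environment, "sessionId": session_id}
```

For example, build_llm_metadata("production", "abc123") produces {"environment": "production", "sessionId": "abc123"}, which is the shape step 5 attaches to the completion call.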

Plan Feedback: Approved by @nsbradford

Something look wrong?: If this Pull Request doesn't contain the expected changes, add more information to #36. Then, add the bitbuilder:create label to try again. For more information, check the documentation.

Generated with ❤️ by www.bitbuilder.ai


vercel bot commented Sep 26, 2023

The latest updates on your projects:

Name | Status | Updated (UTC)
semantic-search-mini | ❌ Failed | Sep 26, 2023 8:03pm

@nsbradford nsbradford left a comment


/bb

@@ -103,5 +104,4 @@ async def start_mongo_client():


 @app.on_event("shutdown")
-async def shutdown_db_client():
-    app.mongo_client.close()
+async def shutdown_db_client():
Review comment from nsbradford (Owner):

add app.mongo_client.close() below here
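The fix being requested is to restore the close() call inside the shutdown handler, which the diff left with an empty body. A minimal runnable sketch, using stub classes in place of the real FastAPI app and Mongo client (both stubs are illustrative, not from the PR):

```python
import asyncio


class _StubMongoClient:
    """Illustrative stand-in for the real Mongo client."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


class _StubApp:
    """Illustrative stand-in for the FastAPI app; only on_event is mimicked."""

    def __init__(self):
        self.mongo_client = _StubMongoClient()

    def on_event(self, name):
        def register(fn):
            return fn
        return register


app = _StubApp()


@app.on_event("shutdown")
async def shutdown_db_client():
    # The line the review comment asks for: close the Mongo client on shutdown.
    app.mongo_client.close()


asyncio.run(shutdown_db_client())
```

After the shutdown handler runs, app.mongo_client.closed is True; without the body, the client connection would simply leak on shutdown.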

ellipsis-dev bot commented Sep 26, 2023

Sorry, BitBuilder encountered an error while addressing comments in this Pull Request. Please try again later. (wflow_hTrxhBkVLD7U4m7d) 🤖

@nsbradford nsbradford left a comment

/bb

class LLMChatCompletionRequest(BaseModel):
    model: str
    messages: List[LLMChatCompletionMessage]
    sessionId: str
Review comment from nsbradford (Owner):

delete redundant class
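After the deletion, only one definition of the model should remain in models.py. A sketch of that single definition, using dataclasses in place of pydantic's BaseModel so it runs standalone (the message field names are assumptions, not quoted from the PR):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class LLMChatCompletionMessage:
    role: str      # assumed field names; not quoted from models.py
    content: str


@dataclass
class LLMChatCompletionRequest:
    # The single remaining definition, with sessionId added per the plan.
    model: str
    messages: List[LLMChatCompletionMessage]
    sessionId: str
```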

@nsbradford nsbradford added and later removed the bitbuilder:review, bug (Something isn't working), and question (Further information is requested) labels on Oct 26, 2023
@nsbradford nsbradford closed this Jun 30, 2024