
Integration with Microsoft Word by selecting both workspace and thread #3428

Closed
GPTLocalhost opened this issue Mar 9, 2025 · 1 comment

@GPTLocalhost

Are you unable to use the workspace/thread endpoints? Those can CRUD threads that you can then manage if you need those threads to show up in the UI as well. If you just want "threading" without having to manage threads, you can just as easily use the sessionId param in the workspace/*/chat endpoints. Obviously this is not OpenAI compatible.

Since OpenAI compatibility will break if we start adding custom fields, there isn't much wiggle room there, as some SDKs will throw an error if our request/response body is not 100% conforming.

Thanks for your helpful guidance.

use the sessionId param in the workspace/*/chat endpoints. Obviously not OpenAI compatible

If so, we plan to migrate our OpenAI-compatible code and parameters to the "workspace/*/chat" endpoints. In that case, is there any sample code you can advise? We are working on integrating AnythingLLM with Microsoft Word locally, like this:

https://youtu.be/-Br_iDDVJBY

The thread is always the "default" one when using OpenAI-compatible endpoints. It would be great to have a deeper integration for selecting both the workspace and the thread. This is the reason for asking. Thanks again.
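For reference, here is a minimal sketch of what a sessionId-based call to the non-OpenAI-compatible workspace chat endpoint might look like. The endpoint path, the `mode` field, and the payload shape are assumptions inferred from this discussion; verify them against your local Swagger docs at http://localhost:3001/api/docs before relying on them.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, slug, message, session_id):
    """Build a POST request to the workspace chat endpoint, using the
    sessionId parameter to keep messages grouped into one session-scoped
    thread (payload fields assumed; check the Swagger docs)."""
    payload = {
        "message": message,
        "mode": "chat",          # assumed field: "chat" vs "query" mode
        "sessionId": session_id, # threads messages without managing threads
    }
    return urllib.request.Request(
        f"{base_url}/api/v1/workspace/{slug}/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Hypothetical values for illustration only:
req = build_chat_request("http://localhost:3001", "YOUR_API_KEY",
                         "my-workspace", "Hello from Word", "word-doc-42")
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) and the exact response shape are left to the API documentation.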

Originally posted by @timothycarambat in #3423

@shatfield4
Collaborator

You should be able to access the Swagger documentation; for streaming chats to a specific workspace, this is the endpoint you would want to use: http://localhost:3001/api/docs/#/Workspaces/post_v1_workspace__slug__stream_chat

The documentation will tell you what response to expect back from the API; your code will just need to handle the streamed chunks of data.
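As a sketch of that chunk handling, the helper below parses server-sent-event style lines and accumulates the text. The `data: {json}` framing and the `textResponse`/`close` field names are assumptions about the streamed chunk shape; confirm them against the Swagger docs for the stream-chat endpoint.

```python
import json

def parse_stream_chunk(raw_line):
    """Parse one streamed line of the form 'data: {json}' into a dict,
    returning None for lines that carry no chunk payload."""
    line = raw_line.strip()
    if not line.startswith("data:"):
        return None
    return json.loads(line[len("data:"):].strip())

def accumulate(lines):
    """Concatenate the textResponse pieces from each chunk until a
    chunk signals the stream is closed (field names assumed)."""
    parts = []
    for raw in lines:
        chunk = parse_stream_chunk(raw)
        if chunk is None:
            continue
        parts.append(chunk.get("textResponse") or "")
        if chunk.get("close"):
            break
    return "".join(parts)
```

In practice you would feed `accumulate` the lines read from the streaming HTTP response body.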
