
feat: add chat with models #2933

Merged: 7 commits merged into main on Aug 6, 2024

Conversation

StanGirard
Collaborator

Description

Please include a summary of the changes and the related issue. Please also include relevant motivation and context.

Checklist before requesting a review

Please delete options that are not relevant.

  • My code follows the style guidelines of this project
  • I have performed a self-review of my code
  • I have commented hard-to-understand areas
  • Ideally, I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged

Screenshots (if appropriate):


vercel bot commented Jul 31, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| quivrapp | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Aug 6, 2024 0:50am |

…de/settings.json

Update the editor.defaultFormatter setting in .vscode/settings.json to "charliermarsh.ruff" to improve code formatting for Python files. This change will ensure consistent code style across the project.
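
For reference, a minimal sketch of what the updated `.vscode/settings.json` entry might look like; only the `editor.defaultFormatter` value `"charliermarsh.ruff"` comes from this commit, and scoping it to a `[python]` block is an assumption about how the project applies it:

```jsonc
{
  // Assumed layout: use the Ruff extension as the default formatter for Python files.
  "[python]": {
    "editor.defaultFormatter": "charliermarsh.ruff"
  }
}
```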
The code changes in `chat_llm_service.py` and `chat_llm.py` add functionality to save the answer for a chat with a model. This is achieved by updating the `save_answer` method in the `ChatLLMService` class and the `last_chunk` variable in the `ChatLLM` class. The changes include logging the chat ID and model name, as well as handling the last chunk of the response.
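
To make the commit description above concrete, here is a rough, hypothetical sketch of the pattern it describes; the `ChatLLM`, `ChatLLMService`, `save_answer`, and `last_chunk` names come from the commit message, while the method bodies, the `astream` call, and the `chat_repository` dependency are assumptions rather than Quivr's actual implementation:

```python
import logging

logger = logging.getLogger(__name__)


class ChatLLM:
    """Illustrative stand-in for the real ChatLLM class."""

    def __init__(self, llm):
        self.llm = llm
        self.last_chunk = None

    async def answer_astream(self, question: str):
        # Stream chunks from the underlying model, remembering the last one,
        # which carries the final answer/metadata needed when saving.
        async for chunk in self.llm.astream(question):  # assumed streaming interface
            self.last_chunk = chunk
            yield chunk


class ChatLLMService:
    """Illustrative stand-in for the real ChatLLMService class."""

    def __init__(self, chat_repository, model_name: str):
        self.chat_repository = chat_repository  # assumed persistence layer
        self.model_name = model_name

    def save_answer(self, chat_id, answer: str):
        # Log the chat ID and model name, then persist the assistant's answer.
        logger.info("Saving answer for chat %s (model=%s)", chat_id, self.model_name)
        self.chat_repository.save(chat_id=chat_id, answer=answer, model=self.model_name)
```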
The code changes in `chat_llm.py` modify the `ChatLLM` class to filter the chat history and include only the messages that are relevant to the current question. This change improves the accuracy of the chat responses.
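
As a purely illustrative sketch of what "include only the messages that are relevant to the current question" could mean, the keyword-overlap heuristic below is an assumption and not the project's actual filtering logic:

```python
from dataclasses import dataclass


@dataclass
class ChatMessage:
    role: str      # "user" or "assistant"
    content: str


def filter_history(history: list[ChatMessage], question: str, max_messages: int = 10) -> list[ChatMessage]:
    """Keep only history messages that share vocabulary with the current question.

    Hypothetical helper: the real ChatLLM class may use a different notion of
    relevance (e.g. embeddings or recency) when trimming the chat history.
    """
    question_terms = {word.lower() for word in question.split()}
    relevant = [
        message for message in history
        if question_terms & {word.lower() for word in message.content.split()}
    ]
    # Cap the kept history so the prompt stays within the model's context window.
    return relevant[-max_messages:]
```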


```python
@pytest.mark.base
def test_chat_llm(fake_llm):
    ...  # test body elided in this excerpt
```
Collaborator

🤟

Collaborator

Do we need to commit this file?

@AmineDiro (Collaborator) left a comment

Very small changes. Nice PR

Co-authored-by: AmineDiro <aminedirhoussi1@gmail.com>
@StanGirard marked this pull request as ready for review August 6, 2024 12:47
@dosubot bot added the size:XL (This PR changes 500-999 lines, ignoring generated files.) and area: backend (Related to backend functionality or under the /backend directory) labels Aug 6, 2024
@StanGirard merged commit fccd197 into main Aug 6, 2024
7 checks passed
@StanGirard deleted the feat/model-chatting branch August 6, 2024 12:51
StanGirard added a commit that referenced this pull request Aug 7, 2024
🤖 I have created a release *beep* *boop*
---


## 0.0.294 (2024-08-07)

## What's Changed
* Delete Porter Application quivr-com by @porter-deployment-app in #2927
* Delete Porter Application quivr-com-backend by @porter-deployment-app in #2928
* feat: quivr core tox test + parsers by @AmineDiro in #2929
* feat(frontend): handle no brain selection by @Zewed in #2932
* fix: processor quivr version by @AmineDiro in #2934
* fix: quivr core fix tests by @AmineDiro in #2935
* chore(main): release core 0.0.13 by @StanGirard in #2930
* feat: Add GitHub sync functionality to sync router by @chloedia in #2871
* refactor: Remove syncGitHub function from useSync.ts by @StanGirard in #2942
* feat: add chat with models by @StanGirard in #2933
* ci: precommit in CI by @AmineDiro in #2946
* feat: Add get_model method to ModelRepository by @StanGirard in #2949
* feat: Add user email to StripePricingOrManageButton and UpgradeToPlusButton components by @StanGirard in #2951


**Full Changelog**: v0.0.293...v0.0.294

---
This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).
StanGirard added a commit that referenced this pull request Sep 10, 2024
🤖 I have created a release *beep* *boop*
---


## [0.0.14](core-0.0.13...core-0.0.14) (2024-09-09)


### Features

* Add brain_id and brain_name to ChatLLMMetadata model ([#2968](#2968)) ([1112001](1112001))
* add chat with models ([#2933](#2933)) ([fccd197](fccd197))
* Add get_model method to ModelRepository ([#2949](#2949)) ([13e9fc4](13e9fc4))
* **anthropic:** add llm ([#3146](#3146)) ([8e29218](8e29218))
* **azure:** quivr compatible with it ([#3005](#3005)) ([b5f31a8](b5f31a8))
* **frontend:** talk with models and handle code markdown ([#2980](#2980)) ([ef6037e](ef6037e))
* quivr core 0.1 ([#2970](#2970)) ([380cf82](380cf82))
* using langgraph in our RAG pipeline ([#3130](#3130)) ([8cfdf53](8cfdf53))


### Bug Fixes

* **chat:** order of chat history was reversed ([#3148](#3148)) ([7209500](7209500))

---
This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).
Labels
area: backend (Related to backend functionality or under the /backend directory)
size:XL (This PR changes 500-999 lines, ignoring generated files.)
2 participants