community[minor]: Integration for Friendli LLM and ChatFriendli ChatModel #5004
Conversation
@@ -0,0 +1,47 @@
import { ChatFriendli } from "@langchain/community/chat_models/friendli";
Hey there! I've reviewed the code and flagged a change for you to review. The added lines in the diff explicitly access environment variables using `process.env`, so it's important to ensure that this is handled appropriately. Let me know if you need further assistance with this.
@@ -0,0 +1,42 @@
import { Friendli } from "@langchain/community/llms/friendli";
Hey there! 👋 This is a friendly flag to bring attention to the addition of explicit environment variable access using `process.env` in the `friendli.ts` file. It's important for maintainers to review this change for potential security and configuration considerations. Thank you! 🚀
langchain/package.json (Outdated)
@@ -3212,6 +3229,15 @@
"import": "./chat_models/fireworks.js",
Hey there! 👋 I noticed the addition of a new entry for "./chat_models/friendli" in the package.json file. This change seems to be related to internal module dependencies, and I've flagged it for the maintainers to review. Keep up the great work! 🚀
@@ -0,0 +1,378 @@
import { CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";
Hey team, I've reviewed the code and noticed that the new changes introduce a net-new HTTP request using the `fetch` API to the Friendli inference endpoint. This comment is to flag the change for maintainers to review. Let me know if you have any questions or need further clarification.
@@ -0,0 +1,378 @@
import { CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";
Hey team, I've reviewed the code and flagged a specific change related to environment variable usage for your attention. The code appears to explicitly access environment variables using the `getEnvironmentVariable` function, so it's important to ensure that the handling of these variables aligns with best practices.
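As a reference point, a hedged sketch of the constructor-first, environment-second pattern such a check usually guards; the field and variable names are assumptions:

```typescript
import { getEnvironmentVariable } from "@langchain/core/utils/env";

// Resolve the token from the constructor field first, then fall back to the
// environment, and fail fast when neither is provided. The field name
// `friendliToken` and the variable name FRIENDLI_TOKEN are assumptions.
function resolveFriendliToken(fields?: { friendliToken?: string }): string {
  const token =
    fields?.friendliToken ?? getEnvironmentVariable("FRIENDLI_TOKEN");
  if (!token) {
    throw new Error(
      "Friendli token not found. Pass friendliToken or set FRIENDLI_TOKEN."
    );
  }
  return token;
}
```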
@@ -0,0 +1,236 @@
import { CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";
Hey team, I've reviewed the code and noticed that the new changes introduce net-new HTTP requests using the `fetch` function. This comment is to flag the change for maintainers to review and ensure it aligns with the project's requirements and standards. Great work on the implementation!
@@ -0,0 +1,236 @@
import { CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";
Hey there! I noticed that the recent code changes explicitly access environment variables using the `getEnvironmentVariable` function. I've flagged this for your review to ensure that the handling of environment variables aligns with best practices. Let me know if you need further assistance with this.
Nice job @seuha516. I have a few comments and questions before I approve. Please take a look. Thank you!
My suggestions aren't blocking, but just wanted to share them with you.
/* #__PURE__ */ logVersion010MigrationWarning({
  oldEntrypointName: "chat_models/friendli",
});
export * from "@langchain/community/chat_models/friendli";
@jacoblee93, do we need to include this for newly added models moving forward? Why not exclude?
Yes, we should remove it, as we are planning to completely remove `@langchain/community` from `langchain` in the near future.
I removed the `langchain/src/llms/friendli.ts` and `langchain/src/chat_models/friendli.ts` files.
for await (const chunk of stream) {
  const parsedChunk = JSON.parse(chunk) as FriendliResponse;

  if (parsedChunk.event !== "complete") {
Suggestion: the `complete` event can also be processed so that generation metadata is returned. Example (something like this):
if (parsedChunk.event !== "complete") {
...
} else {
yield new GenerationChunk({
text: "",
generationInfo: {
promptTokens: parsedChunk.prompt_tokens,
completionTokens: parsedChunk.completion_tokens,
totalTokens: parsedChunk.total_tokens,
},
});
}
I'm looking at this documentation: https://docs.friendli.ai/openapi/create-completions
The same can be implemented in `ChatFriendli`.
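For illustration, a hedged fragment of what that branch could look like in `ChatFriendli`, mirroring the completion snippet above (the chunk field names are carried over from that example; the surrounding streaming generator is assumed):

```typescript
import { AIMessageChunk } from "@langchain/core/messages";
import { ChatGenerationChunk } from "@langchain/core/outputs";

// Inside the chat model's streaming generator: emit a final, empty message
// chunk that carries the token counts from the "complete" event.
if (parsedChunk.event === "complete") {
  yield new ChatGenerationChunk({
    message: new AIMessageChunk({ content: "" }),
    text: "",
    generationInfo: {
      promptTokens: parsedChunk.prompt_tokens,
      completionTokens: parsedChunk.completion_tokens,
      totalTokens: parsedChunk.total_tokens,
    },
  });
}
```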
Updated `Friendli` to return generation metadata when the event is `complete`.
method: "POST", | ||
headers: { | ||
"Content-Type": "application/json", | ||
Accept: "application/json", |
This needs to be `Accept: "text/event-stream"`.
Fixed to `Accept: "text/event-stream"`.
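As a side note, one way to keep the two cases straight is to derive the header from the stream flag; a small sketch with assumed variable names:

```typescript
// Streaming responses are server-sent events, so the Accept header must
// change with the stream flag; JSON is only correct for non-streaming calls.
const buildHeaders = (friendliToken: string, stream: boolean) => ({
  "Content-Type": "application/json",
  Accept: stream ? "text/event-stream" : "application/json",
  Authorization: `Bearer ${friendliToken}`,
});
```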
  // eslint-disable-next-line @typescript-eslint/no-explicit-any
  delta: Record<string, any>
) {
  const { role } = delta;
This is probably a bug in the Friendli API docs, but the example response shows that `role` isn't always returned in the streaming response. If this is true, then this will cause an error.
I'm looking at this documentation: https://docs.friendli.ai/openapi/create-chat-completions#responses
I added a default value of `"assistant"` for `role`.
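A minimal sketch of that defaulting fix (the helper name is hypothetical):

```typescript
// The streaming API can omit `role` on later delta chunks, so default it to
// "assistant" instead of destructuring it unconditionally.
// eslint-disable-next-line @typescript-eslint/no-explicit-any
function roleFromDelta(delta: Record<string, any>): string {
  return delta.role ?? "assistant";
}
```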
@@ -311,6 +314,7 @@ export const config = {
  "vectorstores/zep",
  "chat_models/bedrock",
  "chat_models/bedrock/web",
  "chat_models/friendli",
You don't need to add your models here since they don't use any peer dependencies.
I removed "llms/friendli" and "chat_models/friendli" from requiresOptionalDependency
.
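To illustrate the convention being applied, a hedged sketch of the config shape from the diff above, with the example entries it already lists:

```typescript
// Hedged sketch: only entrypoints that import optional peer dependencies
// belong in this list. The friendli integrations rely solely on fetch, so
// they are left out.
export const config = {
  requiresOptionalDependency: [
    "vectorstores/zep",
    "chat_models/bedrock",
    "chat_models/bedrock/web",
    // "chat_models/friendli" intentionally omitted: no peer dependencies
  ],
};
```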
Thank you for the detailed and kind comments! I made the requested changes.
Nice job with all of the updates! @seuha516
@jacoblee93, are any of the changes under `/langchain` needed? The original `chat_model` and `llm` classes are removed now. Besides that, everything looks good to me.
Ah yes, can you revert the changes there @seuha516? Your fork doesn't seem to allow maintainer edits. Otherwise LGTM!
@jacoblee93 I reverted the changes under `/langchain`.
Thank you!
Description
Added the `Friendli` LLM and `ChatFriendli` chat model (the JS counterpart of the `Friendli` LLM and `ChatFriendli` ChatModel in langchain#17913).
Twitter handle