
Error: Unknown model for all Groq models in ChatGroq #5364

Closed as not planned
@abishekdevendran

Description

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { ChatGroq } from "@langchain/groq";
// No `model` is passed, so ChatGroq uses its default Groq model.
const large_model = new ChatGroq({
    temperature: 0.0
});

import { ConversationSummaryBufferMemory } from "langchain/memory";
import db from "./db/sqlite";
import { small_model } from "./llms";
import {
    BaseMessage,
    StoredMessage,
    mapChatMessagesToStoredMessages,
    mapStoredMessagesToChatMessages,
} from "@langchain/core/messages";
import { BaseListChatMessageHistory } from "@langchain/core/chat_history";

// Not required, but usually chat message histories will handle multiple sessions
// for different users, and should take some kind of sessionId as input.
export interface CustomChatMessageHistoryInput {
    sessionId: string;
}

export class CustomChatMessageHistory extends BaseListChatMessageHistory {
    lc_namespace = ["langchain", "stores", "message"];

    sessionId: string;

    constructor(fields: CustomChatMessageHistoryInput) {
        super(fields);
        this.sessionId = fields.sessionId;
    }

    async getMessages(): Promise<BaseMessage[]> {
        const messagesStr = (db.prepare("SELECT data FROM chat WHERE sessionId = ?").get(this.sessionId) as {
            data: string;
        })?.data ?? "[]";
        const messages: StoredMessage[] = JSON.parse(messagesStr);
        return mapStoredMessagesToChatMessages(messages);
    }

    async addMessage(message: BaseMessage): Promise<void> {
        const retrievedMessagesStr = (db.prepare("SELECT data FROM chat WHERE sessionId = ?").get(this.sessionId) as {
            data: string;
        })?.data ?? "[]";
        const retrievedMessages: StoredMessage[] = JSON.parse(retrievedMessagesStr);
        const serializedMessages = mapChatMessagesToStoredMessages([message]);
        retrievedMessages.push(serializedMessages[0]);
        // console.log("retrievedMessage: ", retrievedMessages);
        db.prepare("INSERT OR REPLACE INTO chat (sessionId, data) VALUES (?, ?)").run(this.sessionId, JSON.stringify(retrievedMessages));
    }

    async addMessages(messages: BaseMessage[]): Promise<void> {
        const retrievedMessagesStr = (db.prepare("SELECT data FROM chat WHERE sessionId = ?").get(this.sessionId) as {
            data: string;
        })?.data ?? "[]";
        const retrievedMessages: StoredMessage[] = JSON.parse(retrievedMessagesStr);
        const serializedMessages = mapChatMessagesToStoredMessages(messages);
        retrievedMessages.push(...serializedMessages);
        // console.log("retrievedMessagessss: ", retrievedMessages);
        db.prepare("INSERT OR REPLACE INTO chat (sessionId, data) VALUES (?, ?)").run(this.sessionId, JSON.stringify(retrievedMessages));
    }

    async clear(): Promise<void> {
        db.prepare("DELETE FROM chat WHERE sessionId = ?").run(this.sessionId);
    }
}

export function getMemory(sessionId: string): ConversationSummaryBufferMemory {
    const history = new CustomChatMessageHistory({ sessionId: sessionId });
    return new ConversationSummaryBufferMemory({
        chatHistory: history,
        memoryKey: "chat_history",
        llm: small_model,
    });
}

// Elsewhere, the call that triggers the token-counting/pruning step:
memory.saveContext({ input: query }, { output: resp.response });
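One possible mitigation (a sketch, not from the report; it assumes `getNumTokens` is overridable on the instance, as it is inherited from `BaseLanguageModel`): replace the token counter with the same chars/4 approximation LangChain falls back to anyway, so the tiktoken lookup and its warning are skipped entirely. `patchTokenCounter` and the stand-in object are hypothetical names for illustration:

```typescript
// Hypothetical workaround sketch: monkey-patch getNumTokens so the
// tiktoken model-name lookup (which throws "Unknown model" for Groq
// models) is never attempted. The stand-in object below mimics only
// the relevant shape of a ChatGroq instance.
type TokenCountable = { getNumTokens(text: string): Promise<number> };

function patchTokenCounter(model: TokenCountable): void {
  // Roughly one token per four characters, matching LangChain's
  // approximate fallback.
  model.getNumTokens = async (text: string) => Math.ceil(text.length / 4);
}

// Usage with a stand-in; in the report's code this would be small_model:
const standIn: TokenCountable = {
  getNumTokens: async () => {
    throw new Error("Unknown model");
  },
};
patchTokenCounter(standIn);
standIn.getNumTokens("abcdefgh").then((n) => console.log(n)); // 2
```

Because the approximation is what the library falls back to after the failed lookup anyway, this only silences the warning rather than changing the pruning behavior.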

Error Message and Stack Trace (if applicable)

Failed to calculate number of tokens, falling back to approximate count Error: Unknown model
    at getEncodingNameForModel (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/js-tiktoken@1.0.12/node_modules/js-tiktoken/dist/chunk-PEBACC3C.js:230:13)
    at encodingForModel (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/@langchain+core@0.1.63_openai@4.46.1/node_modules/@langchain/core/dist/utils/tiktoken.js:19:24)
    at ChatGroq.getNumTokens (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/@langchain+core@0.1.63_openai@4.46.1/node_modules/@langchain/core/dist/language_models/base.js:177:40)
    at ConversationSummaryBufferMemory.prune (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/langchain@0.1.37_@xenova+transformers@2.17.1_@zilliz+milvus2-sdk-node@2.4.2_better-sqlite3@9._byv6wm32l75bjvaef4ttqhmd3e/node_modules/langchain/dist/memory/summary_buffer.js:114:47)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async ConversationSummaryBufferMemory.saveContext (file:///C:/Github/fc-monolith/hono/node_modules/.pnpm/langchain@0.1.37_@xenova+transformers@2.17.1_@zilliz+milvus2-sdk-node@2.4.2_better-sqlite3@9._byv6wm32l75bjvaef4ttqhmd3e/node_modules/langchain/dist/memory/summary_buffer.js:96:9)

Description

I'm trying to summarize and persist chat history with ConversationSummaryBufferMemory, and when I switch the LLM from ChatOpenAI or ChatOllama to ChatGroq, I get the error above.
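For context (not part of the original report): js-tiktoken resolves a tokenizer by looking the model name up in a table of known OpenAI model names, so any Groq model name misses and throws. A simplified sketch of that lookup, with a hypothetical two-entry table:

```typescript
// Simplified sketch (assumption: condensed from js-tiktoken's
// getEncodingNameForModel) of why the lookup throws for Groq models:
// the table only maps OpenAI model names to tokenizer encodings.
const MODEL_TO_ENCODING: Record<string, string> = {
  "gpt-3.5-turbo": "cl100k_base",
  "gpt-4": "cl100k_base",
};

function getEncodingNameForModel(modelName: string): string {
  const encoding = MODEL_TO_ENCODING[modelName];
  if (encoding === undefined) {
    // This is the error surfaced in the stack trace above.
    throw new Error("Unknown model");
  }
  return encoding;
}
```

A Groq model name is absent from the table, so `getNumTokens` catches the throw and falls back to an approximate count, which is what the "falling back to approximate count" warning in the trace reports.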

System Info

langchain@0.1.37 | MIT | deps: 18 | versions: 270
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.1.37.tgz
.shasum: 15db8ca5c24afc39e61773cab69e216dfb38e1bb
.integrity: sha512-rpaLEJtRrLYhAViEp7/aHfSkxbgSqHJ5n10tXv3o4kHP/wOin85RpTgewwvGjEaKc3797jOg+sLSk6a7e0UlMg==
.unpackedSize: 4.0 MB

dependencies:
@anthropic-ai/sdk: ^0.9.1
@langchain/community: ~0.0.47
@langchain/core: ~0.1.60
@langchain/openai: ~0.0.28
@langchain/textsplitters: ~0.0.0
binary-extensions: ^2.2.0
js-tiktoken: ^1.0.7
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langchainhub: ~0.0.8
langsmith: ~0.1.7
ml-distance: ^4.0.0
openapi-types: ^12.1.3
p-retry: 4
uuid: ^9.0.0
yaml: ^2.2.1
zod-to-json-schema: ^3.22.3
zod: ^3.22.4

dist-tags:
latest: 0.1.37 next: 0.2.0-rc.1

published 3 days ago by basproul braceasproul@gmail.com

@langchain/groq@0.0.9 | MIT | deps: 5 | versions: 9
Groq integration for LangChain.js
https://github.com/langchain-ai/langchainjs/tree/main/libs/langchain-groq/

dist
.tarball: https://registry.npmjs.org/@langchain/groq/-/groq-0.0.9.tgz
.shasum: a44b19af3b784f324057bfb0217ff0613d148c2f
.integrity: sha512-/QGGgazYdxlN8FCmPfEVDO9Hg55POvQdnoou+b3lsugmwP1TYPRtqLW6JY7Atb36X4vjEJwiMCnntDXdT7vgaw==
.unpackedSize: 48.8 kB

dependencies:
@langchain/core: ~0.1.56 groq-sdk: ^0.3.2 zod: ^3.22.4
@langchain/openai: ~0.0.28 zod-to-json-schema: ^3.22.5

dist-tags:
latest: 0.0.9

published 2 weeks ago by basproul braceasproul@gmail.com

node --version: v20.11.0
Platform: Windows 11 Version 10.0.22631 Build 22631 x64

Metadata

Labels

auto:bug: Related to a bug, vulnerability, unexpected error with an existing feature
