Can new AI interfaces be added? #50

Open
ChenJunzhe opened this issue Nov 21, 2023 · 1 comment
Comments

@ChenJunzhe

Claude is no longer free, and ChatGPT often fails to connect.

@ShanJianSoda

ShanJianSoda commented Jul 3, 2024

I tried writing a Kimi version by imitating the existing code, but it throws an error:

……
[2024-07-03 11:30:29,567][run_on_private_msg_0/ERROR] PyCqBot: 'OpenAI' object has no attribute 'get_num_tokens_from_messages'

This is modeled on GPT.py:

from waifu.llm.Brain import Brain
from waifu.llm.VectorDB import VectorDB
from waifu.llm.SentenceTransformer import STEmbedding
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from typing import Any, List, Mapping, Optional
from langchain.schema import BaseMessage, HumanMessage
from openai import OpenAI


class Kimi(Brain):
    def __init__(self,
                 api_key: str,
                 name: str
                 ):
        self.llm = OpenAI(
            api_key=api_key,
            base_url="https://api.moonshot.cn/v1",
        )
        # self.llm_nonstream = ChatOpenAI(openai_api_key=api_key, model_name=model)
        # self.embedding = OpenAIEmbeddings(openai_api_key=api_key)
        self.embedding = STEmbedding()
        self.vectordb = VectorDB(self.embedding, f'./memory/{name}.csv')

    def think(self, messages: List[BaseMessage]):
        # messages -> history
        history = []
        for message in messages:
            history.append(
                {'role': 'assistant' if message.type in ['ai', 'chat'] else 'user', "content": message.content})

        # chat models are called via chat.completions, not completions
        completion = self.llm.chat.completions.create(
            model="moonshot-v1-8k",
            messages=history,
            temperature=0.3,
        )
        result = completion.choices[0].message.content
        return result

    def think_nonstream(self, messages: List[BaseMessage]):
        history = []
        for message in messages:
            history.append(
                {'role': 'assistant' if message.type in ['ai', 'chat'] else 'user', "content": message.content})

        completion = self.llm.chat.completions.create(
            model="moonshot-v1-8k",
            messages=history,
            temperature=0.3,
        )
        result = completion.choices[0].message.content
        return result

    def store_memory(self, text: str | list):
        '''Store memories as embeddings'''
        self.vectordb.store(text)

    def extract_memory(self, text: str, top_n: int = 10):
        '''Retrieve the top_n most relevant memories'''
        return self.vectordb.query(text, top_n)


def main():
    # BaseMessage is abstract; use a concrete HumanMessage for a quick test
    msg = [HumanMessage(content='hello')]

    llm = Kimi('sk-djaZQbBOnEuyLS31cr9HvXvcbyfxcRDStgLP8hKN4g6Mz9sN', 'ARONA')
    print(llm.think(msg))


if __name__ == "__main__":
    main()

If the author has time, could you write a Kimi interface? (Orz)
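
For what it's worth, here is a minimal sketch of one possible workaround, assuming the error means the surrounding framework calls get_num_tokens_from_messages() on self.llm (a method LangChain's ChatOpenAI provides but the raw openai.OpenAI client does not): since Moonshot's endpoint is OpenAI-compatible, the constructor could wrap it in the ChatOpenAI class already imported above. The think() body below is illustrative, not the project's actual method.

from langchain.chat_models import ChatOpenAI


class KimiSketch(Brain):
    def __init__(self, api_key: str, name: str):
        # Sketch: point LangChain's ChatOpenAI at Moonshot's OpenAI-compatible
        # endpoint so framework calls such as get_num_tokens_from_messages()
        # resolve on self.llm.
        self.llm = ChatOpenAI(
            openai_api_key=api_key,
            openai_api_base='https://api.moonshot.cn/v1',
            model_name='moonshot-v1-8k',
            temperature=0.3,
        )
        self.embedding = STEmbedding()
        self.vectordb = VectorDB(self.embedding, f'./memory/{name}.csv')

    def think(self, messages: List[BaseMessage]):
        # ChatOpenAI accepts LangChain BaseMessage objects directly,
        # so no manual role mapping is needed here.
        return self.llm(messages).content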
