Solution for choosing the GPT model to be used #4
I vouched for this in the discussion. It would be great if one could also hook up a local LLM.
I'm not able to use the gpt-4-0613 model required by Aria (see the attachment). It seems to be a common issue, although on July 6 OpenAI released GPT-4 to all users. It would be nice to add an option that lets you choose which model to use. Thank you!
Hi @lparolari, right now this plugin relies heavily on the new "function calling" feature in GPT-4-0613 (technically this feature is also available in GPT-3.5-Turbo-0613, but it is far less reliable there). I hope OpenAI will resolve your access issue soon. In the future, I am definitely interested in exploring open-source LLM options such as Llama 2.
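As a rough illustration of what a model-selection option could look like, here is a minimal sketch that assembles a chat-completion request with a configurable model name, defaulting to gpt-4-0613. It assumes the legacy openai-python (0.x) chat-completions keyword arguments; the `ARIA_MODEL` environment variable and the helper itself are hypothetical, not part of the actual plugin.

```python
# Sketch: configurable model with a gpt-4-0613 default.
# Assumes the legacy openai-python (0.x) ChatCompletion interface;
# ARIA_MODEL and build_chat_request are illustrative names only.
import os

DEFAULT_MODEL = "gpt-4-0613"  # the model the plugin currently requires

def build_chat_request(messages, functions=None, model=None):
    """Assemble keyword arguments for openai.ChatCompletion.create()."""
    # Explicit argument wins, then an (assumed) env override, then the default.
    chosen = model or os.environ.get("ARIA_MODEL", DEFAULT_MODEL)
    request = {"model": chosen, "messages": messages}
    if functions:
        # Function calling is the feature that currently ties Aria to 0613 models.
        request["functions"] = functions
        request["function_call"] = "auto"
    return request
```

With such a helper, passing `model="gpt-3.5-turbo-0613"` would target the cheaper model for simple jobs, with the caveat noted above that its function calling is less reliable.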
Hello @lifan0127. What a great evolution of the project, which I have been following, and you are always diligently helping; again, my thanks. Would it be possible to add a field for choosing which model the plugin accesses? This would be useful for saving tokens on jobs where a larger model is not necessary, or for switching up to GPT-4, for example.