Error on privateGPT.py execution #459

Closed
shameforest opened this issue May 24, 2023 · 18 comments
Labels: bug (Something isn't working), primordial (Related to the primordial version of PrivateGPT, which is now frozen in favour of the new PrivateGPT)

Comments

@shameforest

shameforest commented May 24, 2023

I managed to use ingest.py and load the txt files.
Afterwards I try to execute privateGPT.py and get the following error message (every time):

Using embedded DuckDB with persistence: data will be stored in: db
Traceback (most recent call last):
  File "c:\privateGPT-main\privateGPT.py", line 75, in <module>
    main()
  File "c:\privateGPT-main\privateGPT.py", line 35, in main
    llm = GPT4All(model=model_path, n_ctx=model_n_ctx, backend='gptj', callbacks=callbacks, verbose=False)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for GPT4All
__root__
  Failed to retrieve model (type=value_error)


  • Windows 11
  • Python version 3.10.9

My .env file looks like this:

MODEL_TYPE=GPT4All
PERSIST_DIRECTORY=db
MODEL_PATH=models\ggml-gpt4all-j-v1.3-groovy.bin
MODEL_N_CTX=1000
EMBEDDINGS_MODEL_NAME=distiluse-base-multilingual-cased-v2
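
For context, the primordial privateGPT.py reads these settings from the environment. A minimal sketch of that loading logic (the dotenv call and the variable names other than MODEL_PATH are assumptions based on this thread, not a copy of the project's code):

# Sketch: load the .env file and read the settings privateGPT expects
import os
from dotenv import load_dotenv

load_dotenv()  # makes the values from .env visible via os.environ

model_type = os.environ.get('MODEL_TYPE')                # e.g. GPT4All
persist_directory = os.environ.get('PERSIST_DIRECTORY')  # e.g. db
model_path = os.environ.get('MODEL_PATH')                # e.g. models/ggml-gpt4all-j-v1.3-groovy.bin
model_n_ctx = os.environ.get('MODEL_N_CTX')              # e.g. 1000
embeddings_model_name = os.environ.get('EMBEDDINGS_MODEL_NAME')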

shameforest added the bug label on May 24, 2023
@sandyrs9421

I see the same error too. Any luck on how we can resolve this?

My .env file:
PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=/Users/FBT/Desktop/Projects/privategpt/privateGPT/models/ggml-model-q4_0.bin
MODEL_N_CTX=1000

@christopherpickering

Did you download the model to this path: models\ggml-gpt4all-j-v1.3-groovy.bin?

@shameforest
Author

Yes, it is in that path. I copied the relative path to the file and inserted it into the .env file, as described above.

@UserB-tm

I'm also able to ingest, but it seems to get stuck at model load.
[screenshot]

@christopherpickering

Does the groovy model work? I removed the backend='gptj' argument in the privateGPT.py file to get the model to load.
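
For reference, a minimal sketch of that change, using line 35 as it is quoted in the tracebacks in this thread:

# Before:
llm = GPT4All(model=model_path, n_ctx=model_n_ctx, backend='gptj', callbacks=callbacks, verbose=False)
# After removing the backend argument:
llm = GPT4All(model=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)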

@shameforest
Author

I removed backend=gptj now. This is my error message:

Traceback (most recent call last):
  File "c:\privateGPT-main\privateGPT.py", line 75, in <module>
    main()
  File "c:\privateGPT-main\privateGPT.py", line 35, in main
    llm = GPT4All(model=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for GPT4All
__root__
  No corresponding model for provided filename models\ggml-gpt4all-j-v1.3-groovy.bin.
  If this is a custom model, make sure to specify a valid model_type.
  (type=value_error)


In my models folder are these two files:
models\ggml-gpt4all-j-v1.3-groovy.bin
models\ggml-model-q4_0.bin

@Andrew-Mayorga

Make sure the LLM is inside the models folder.

@shameforest
Author

You are speaking of:
models\ggml-gpt4all-j-v1.3-groovy.bin
models\ggml-model-q4_0.bin

right?

They are both in the models folder, both in the real file system (C:\privateGPT-main\models) and inside Visual Studio Code (models\ggml-gpt4all-j-v1.3-groovy.bin) as well.
Is there anything else that could be the problem?

@HenryUn

HenryUn commented May 28, 2023

I guess the error might be caused by "model", which should perhaps be "model_path".

The error shows:
File "c:\privateGPT-main\privateGPT.py", line 35, in main
llm = GPT4All(model=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)

I checked the privateGPT.py file; it only defines: model_path = os.environ.get('MODEL_PATH')
So, I guess the code should be:

llm = GPT4All(model_path=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)

Please feel free to correct any of my mistakes. Thanks.

@shameforest
Author

I replaced my line 35 with your code, but it just gives this error:

Using embedded DuckDB with persistence: data will be stored in: db
Traceback (most recent call last):
  File "c:\privateGPT-main\privateGPT.py", line 75, in <module>
    main()
  File "c:\privateGPT-main\privateGPT.py", line 35, in main
    llm = GPT4All(model_path=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)
  File "pydantic\main.py", line 339, in pydantic.main.BaseModel.__init__
  File "pydantic\main.py", line 1102, in pydantic.main.validate_model
  File "C:\Users\Amin\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\llms\gpt4all.py", line 135, in validate_environment
    full_path = values["model"]
KeyError: 'model'
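
The KeyError comes from langchain's gpt4all.py reading values["model"] (see the last frame above), so the wrapper evidently expects the keyword argument to stay model=. A small sanity check before constructing the LLM can separate "wrong keyword" from "file not found"; a sketch, not part of the original script, reusing the model_path variable from privateGPT.py:

import os

# Fail early with a clear message if the .env path does not resolve to a file
if not os.path.isfile(model_path):
    raise FileNotFoundError(f"Model file not found at {model_path!r}; check MODEL_PATH in .env")

llm = GPT4All(model=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)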

@buhajbej

Same here. Changing "model" to "model_path" helped with the privateGPT.py error, but now I'm stuck with the gpt4all.py error.

@bdbais

bdbais commented May 30, 2023

When I try to run privateGPT I get the error below. I have the latest version of Python... how can I run it in a compatible version mode?

python3 privateGPT.py
File "/home/privateGPT/privateGPT/privateGPT.py", line 32
match model_type:
^
SyntaxError: invalid syntax

python3 --version
Python 3.9.7

uname -an
Linux hostname 5.13.0-27-generic #29-Ubuntu SMP Wed Jan 12 17:36:47 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux

@Fridry

Fridry commented May 30, 2023

I had the same error with this .env file:

PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=/Users/FBT/Desktop/Projects/privategpt/privateGPT/models/ggml-model-q4_0.bin
MODEL_N_CTX=1000

I just replaced models with source_documents in MODEL_TYPE. My MODEL_TYPE is like this now:
MODEL_TYPE: source_documents/ggml-gpt4all-j-v1.3-groovy.bin

@geller6980

I had the same error with this .env file:

PERSIST_DIRECTORY=db
MODEL_TYPE=GPT4All
MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
EMBEDDINGS_MODEL_NAME=/Users/FBT/Desktop/Projects/privategpt/privateGPT/models/ggml-model-q4_0.bin
MODEL_N_CTX=1000

I just replaced models with source_documents in MODEL_TYPE. My MODEL_TYPE is like this now: MODEL_TYPE: source_documents/ggml-gpt4all-j-v1.3-groovy.bin

This worked perfectly!

I also commented out the lines under "Print the relevant sources used for the answer" to shorten the text.

Is there a way to speed up the response time?
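
As for the source-printing part mentioned above, commenting it out might look roughly like this (a sketch only; the exact loop and variable names in the primordial privateGPT.py are assumed, not copied):

# Print the relevant sources used for the answer
# for document in docs:
#     print("\n> " + document.metadata["source"] + ":")
#     print(document.page_content)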

@hansvanzutphen

Found it. Don't use a \ in the path. So replace
MODEL_PATH=models\ggml-gpt4all-j-v1.3-groovy.bin
with
MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin
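
If you want the script itself to tolerate either separator, one option (a sketch, not part of the original code) is to normalise the value right after reading it:

import os
import pathlib

model_path = os.environ.get('MODEL_PATH', '')
# Convert Windows-style backslashes to forward slashes, e.g. 'models\\x.bin' -> 'models/x.bin'
model_path = pathlib.PureWindowsPath(model_path).as_posix()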

@mvange

mvange commented Jun 5, 2023

Solved:

#89

@melroy89 melroy89 mentioned this issue Jun 5, 2023
@m3hrdadpourkheiri

Hi,
I ran ingest.py with Python 3.9.13 and it worked correctly,
but when I run privateGPT.py it returns this error:
C:\Users\Mehrdad\Desktop\AI\newpgpt\privateGPT>python privateGPT.py
File "C:\Users\Mehrdad\Desktop\AI\newpgpt\privateGPT\privateGPT.py", line 34
match model_type:
^
SyntaxError: invalid syntax

Please help me.

@naveeagrawal

@m3hrdadpourkheiri The match statement only works in Python 3.10 and above.
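
If upgrading Python is not an option, the match block can be rewritten as if/elif for older versions; a rough sketch (the branch bodies are assumed from this thread, not copied from privateGPT.py):

# Equivalent of `match model_type:` for Python < 3.10
if model_type == "LlamaCpp":
    llm = LlamaCpp(model_path=model_path, n_ctx=model_n_ctx, callbacks=callbacks, verbose=False)
elif model_type == "GPT4All":
    llm = GPT4All(model=model_path, n_ctx=model_n_ctx, backend='gptj', callbacks=callbacks, verbose=False)
else:
    raise ValueError(f"Model type {model_type} is not supported.")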

imartinez added the primordial label on Oct 19, 2023