Issue in predict.py: no config.json file while loading the trained model #6
Comments
I have the same issue too.
Hi, we have updated the predict.py file. You should set the hyper-parameters the same as in training, and have the checkpoint file (model.bin) in the `model_dir` argument.
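The advice above can be sketched as follows. This is a minimal illustration, assuming the checkpoint is a raw PyTorch state dict saved as `model.bin` (which is why no Hugging Face `config.json` exists); the `TinyTagger` class and its hyper-parameters are hypothetical stand-ins for the real model, and the point is that the prediction-side constructor arguments must match the training-side ones exactly.

```python
# Illustrative sketch: a model.bin checkpoint holds only weights, no
# config.json, so the model must be rebuilt with the same hyper-parameters
# used at training time before load_state_dict() can succeed.
import os
import tempfile

import torch
import torch.nn as nn


class TinyTagger(nn.Module):  # hypothetical stand-in for the real model
    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, x):
        return self.classifier(x)


model_dir = tempfile.mkdtemp()

# --- training side: only the state dict is saved, no config.json ---
trained = TinyTagger(hidden_size=8, num_labels=3)
torch.save(trained.state_dict(), os.path.join(model_dir, "model.bin"))

# --- prediction side: rebuild with the SAME hyper-parameters, then load ---
model = TinyTagger(hidden_size=8, num_labels=3)
state = torch.load(os.path.join(model_dir, "model.bin"), map_location="cpu")
model.load_state_dict(state)
model.eval()
```

If the hyper-parameters differ (e.g. a different `hidden_size`), `load_state_dict` raises a size-mismatch error, which is the usual symptom of this class of problem.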
Hi. What about line 231,

`if 'bert' in self.args.model_type:`

Shouldn't this be changed to `'roberta'` instead?

`if 'roberta' in self.args.model_type:`

Because your code uses `lstm` and `roberta` only.

Regards,
Beatrice
Yes, you can change it to `'roberta'` in this case, but our code can be used with other BERT-based PLMs.
While loading the model, it fails because no config.json file is present in the generated model directory:

OSError: misca does not appear to have a file named config.json. Checkout 'https://huggingface.co/misca/main' for available files.

Also, after loading the model, prediction asks for `sequence_length` and `heads`, which are not present in the inputs dictionary.
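For the unexpected `sequence_length` and `heads` arguments, one generic workaround (not the project's own fix, just a sketch) is to pass the model only the keys its `forward()` actually accepts. All names below are illustrative; the stand-in `forward` function plays the role of the real model's forward method.

```python
# Hedged sketch: filter an inputs dict down to the parameters a callable
# accepts, so extra keys (or missing optional ones) do not raise TypeError.
import inspect


def call_with_supported_kwargs(fn, inputs: dict):
    """Call fn with only the entries of `inputs` that fn's signature accepts."""
    params = inspect.signature(fn).parameters
    accepted = {k: v for k, v in inputs.items() if k in params}
    return fn(**accepted)


def forward(input_ids, attention_mask=None):  # stand-in for model.forward
    return {"input_ids": input_ids, "attention_mask": attention_mask}


out = call_with_supported_kwargs(forward, {
    "input_ids": [1, 2, 3],
    "attention_mask": [1, 1, 1],
    "token_type_ids": [0, 0, 0],  # silently dropped: forward() lacks this arg
})
```

The cleaner fix, per the maintainers' comment above, is to build the inputs dictionary with exactly the keys the model was trained with, but the filter above helps diagnose which arguments are actually expected.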