How to use pretrained models for prediction? Is there a tutorial? #13
Plain text without format requirements corresponds to the BRAT format with no annotations, which is supported by NeuroNER.
You should set the relevant options in the `parameters.ini` configuration file.
There are instructions on how to use the pretrained models in the documentation.
Please let us know if that answers your question! Point taken, we should make it clearer in the documentation.
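For concreteness, here is a minimal sketch of loading a pretrained model for prediction via the pip-installable `neuroner` package. The keyword arguments mirror the options in `parameters.ini`; both folder paths are placeholders, not files shipped with the repository:

```python
from neuroner import neuromodel

# Configure NeuroNER for prediction only; the keyword names mirror
# the options in parameters.ini, and both paths are placeholders.
nn = neuromodel.NeuroNER(
    train_model=False,           # do not train a new model
    use_pretrained_model=True,   # load an existing model instead
    pretrained_model_folder='./trained_models/conll_2003_en',
    dataset_text_folder='./data/my_unannotated_texts',  # plain .txt files
)

nn.fit()    # with train_model=False this should run the deploy/prediction pass
nn.close()  # release the TensorFlow session
```

NeuroNER should then write the predicted annotations (BRAT `.ann` files) to its output folder.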
Yes. Now it works!
Here's what I missed:
Many thanks, Franck - you just saved my day!
Thanks for the feedback!
@Franck-Dernoncourt How can I load a pre-trained model in memory and classify new sentences on the fly? I have followed this approach, and I can classify sentences if I put them in the proper folder/file location. I was wondering if I can keep the model loaded in memory and somehow call it directly. UPDATE: I think I missed this; with the snippet below I can get results for the input sentence.
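A minimal sketch of that in-memory usage, assuming the `neuromodel.NeuroNER` class from above and the `predict` method mentioned later in this thread (its exact return type is an assumption):

```python
from neuroner import neuromodel

# Load the pretrained model once and keep it in memory;
# the folder path is a placeholder.
nn = neuromodel.NeuroNER(
    train_model=False,
    use_pretrained_model=True,
    pretrained_model_folder='./trained_models/conll_2003_en',
)

# Classify a new sentence on the fly, without writing it to the
# dataset folders first. `predict` returning a list of entity
# dictionaries is an assumption, not a documented guarantee.
entities = nn.predict('Barack Obama was born in Hawaii.')
print(entities)
```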
I think I can make changes to the `predict` method for that.
Did you make changes to the `predict` method to accept a list of texts and return all the results without saving them to a temp directory? Let me know how you did it. Thanks! @new2scala
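In case it helps, a hedged sketch of such a wrapper: rather than modifying NeuroNER's internals, it loops over the single-sentence `predict` call assumed above and collects the results in memory:

```python
from typing import List

def predict_batch(nn, texts: List[str]) -> List[list]:
    """Tag a list of texts with an in-memory NeuroNER model.

    Assumes nn.predict(text) returns the entities for one text,
    as in the snippet above; nothing is written to a temp directory.
    """
    return [nn.predict(text) for text in texts]

# Usage: the model is loaded once and reused for every call.
# results = predict_batch(nn, ['First sentence.', 'Second sentence.'])
```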
I have been trying for 2 days with little progress. It seems that the tool works only with a specific format: one token per line with four space-separated columns, the fourth being the label.
Ideally, the input for the prediction mode should be plain text without format requirements. I've tried to use spaCy to convert plain text to that format, but I have no idea how to generate the 3rd column (a sketch along these lines follows below).
And it seems that the result actually depends on the supplied label (the 4th column), contrary to what the documentation says, but I'm probably missing something here...
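For reference, a hedged sketch of such a conversion, assuming the CoNLL-2003-style layout (token, POS tag, chunk tag, label) that the four columns above suggest. The chunk column is derived from spaCy's noun chunks, and every label is a dummy `O`, since the real labels are what the model is supposed to predict:

```python
import spacy

nlp = spacy.load('en_core_web_sm')

def to_conll(text: str) -> str:
    """Convert plain text to a CoNLL-2003-style layout:
    'token POS chunk label', one token per line, blank line between
    sentences. Chunk tags come from spaCy noun chunks (B-NP/I-NP,
    else O); the label column is a dummy 'O' placeholder.
    """
    doc = nlp(text)
    chunk_tags = {}  # token index -> BIO noun-chunk tag
    for chunk in doc.noun_chunks:
        chunk_tags[chunk.start] = 'B-NP'
        for i in range(chunk.start + 1, chunk.end):
            chunk_tags[i] = 'I-NP'
    lines = []
    for sent in doc.sents:
        for token in sent:
            lines.append('{} {} {} O'.format(
                token.text, token.tag_, chunk_tags.get(token.i, 'O')))
        lines.append('')  # sentence separator
    return '\n'.join(lines)

print(to_conll('Barack Obama was born in Hawaii.'))
```

Whether the chunk column actually influences NeuroNER's predictions is unclear from the documentation; if it does not, a constant `O` there should be safe as well.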