This repository has been archived by the owner on Dec 16, 2022. It is now read-only.
Hello, I have trained a coreference model, which produced model_1.tar.gz. Afterwards, I decided to reuse this model in the training config by adding the following key to the .jsonnet.
This worked nicely and resulted in a better model_2.tar.gz. However, upon loading model_2.tar.gz via Predictor.from_path("./allennlp_output_1/model_1.tar.gz"), it requires allennlp_output_1/model_1.tar.gz to be present as well.
@epwalsh Similarly, loading a model with a custom dataset loader requires that dataset loader to have been registered, even though I just want to use the model for inference.
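To see why the custom dataset loader must be importable at inference time, here is a minimal, self-contained sketch of the registry pattern that name-based deserialization relies on (this is illustrative only, not AllenNLP's actual code; the "my_reader" name and classes are hypothetical). Registration happens as a side effect of importing the module that defines the class, so if that module is never imported, the name in config.json resolves to nothing:

```python
# Illustrative sketch of name-to-class registration (not AllenNLP's real code).
# A decorator records each class in a registry when its module is imported;
# loading a config by "type" only works if that import already happened.
_REGISTRY: dict = {}

def register(name):
    def wrap(cls):
        _REGISTRY[name] = cls  # runs at import time of the defining module
        return cls
    return wrap

def from_config(config):
    """Instantiate the class named by config['type'] from the registry."""
    try:
        cls = _REGISTRY[config["type"]]
    except KeyError:
        raise KeyError(
            f"'{config['type']}' is not registered; import the module that defines it first"
        )
    return cls()

@register("my_reader")  # hypothetical custom dataset reader name
class MyReader:
    pass
```

This is why inference-only code still needs something like an `--include-package` flag or an explicit import of the package defining the loader: without it, the registry lookup fails.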
So, the main issue here was brought up in #5211, but we're not likely to implement the solution anytime soon. We just don't have the bandwidth, and we're no longer focused on adding new features to AllenNLP. But there is a workaround: you could update the config.json file within the archive model_2.tar.gz and remove the reference to allennlp_output_1/model_1.tar.gz.
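The workaround above can be sketched as a small script that unpacks the archive, drops the offending key from config.json, and repacks it in place. This is a hedged sketch: the actual key name that references model_1.tar.gz depends on your config (it was not shown in the issue), so the key passed in here is a placeholder you would replace with the real one:

```python
import json
import os
import shutil
import tarfile
import tempfile

def strip_config_key(archive_path: str, key: str) -> None:
    """Remove a top-level `key` from config.json inside a model archive, in place.

    `key` is whatever entry in your config references the first model's
    archive (the exact name depends on your .jsonnet and is not known here).
    """
    workdir = tempfile.mkdtemp()
    try:
        # Unpack the whole archive so everything else is preserved.
        with tarfile.open(archive_path, "r:gz") as tar:
            tar.extractall(workdir)
        config_path = os.path.join(workdir, "config.json")
        with open(config_path) as f:
            config = json.load(f)
        config.pop(key, None)  # drop the reference to model_1.tar.gz
        with open(config_path, "w") as f:
            json.dump(config, f, indent=2)
        # Repack in place with the edited config.
        with tarfile.open(archive_path, "w:gz") as tar:
            for name in os.listdir(workdir):
                tar.add(os.path.join(workdir, name), arcname=name)
    finally:
        shutil.rmtree(workdir)

# Example (paths from the issue, key name hypothetical):
# strip_config_key("./allennlp_output_2/model_2.tar.gz", "initializer")
```

After this, Predictor.from_path should no longer try to resolve the path to the first archive, since the edited config.json no longer mentions it.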