Summarisation Model loading #3168
YuryAlsheuski asked this question in Q&A (unanswered)
I would like to load facebook/bart-large-cnn from Hugging Face to create text summaries.
I have two cases:
Unsupported model architecture: BartForConditionalGeneration for facebook/bart-large-cnn.
So my first question is: which architectures are supported by model_zoo_importer.py?
Thanks in advance for any ideas!

Replies: 1 comment
You need to manually trace the model to TorchScript (PyTorch) or convert it to ONNX. You can use this as an example to build your own LLM application: https://github.com/deepjavalibrary/djl/blob/master/examples/src/main/java/ai/djl/examples/inference/nlp/TextGeneration.java
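
As a rough illustration of the tracing route (this sketch is mine, not part of the thread): the snippet below traces a single forward step of facebook/bart-large-cnn to TorchScript with Hugging Face transformers, which the DJL PyTorch engine can then load. The dummy inputs, the output file name bart-large-cnn.pt, and the single decoder start token are illustrative assumptions; the generation loop itself (greedy or beam search over decoder steps) still has to be driven from the Java application, along the lines of the linked TextGeneration example. The ONNX alternative can likewise be produced with Hugging Face Optimum's ONNX exporter.

```python
# Minimal sketch (assumed, not from this thread): trace the forward pass of
# facebook/bart-large-cnn to TorchScript so DJL's PyTorch engine can load it.
# Requires the torch and transformers packages.
import torch
from transformers import AutoTokenizer, BartForConditionalGeneration

model_id = "facebook/bart-large-cnn"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# torchscript=True makes the model return plain tuples, which torch.jit.trace
# needs instead of the usual ModelOutput objects.
model = BartForConditionalGeneration.from_pretrained(model_id, torchscript=True)
model.eval()

# Dummy inputs: encoder tokens plus a single decoder start token. These are
# only used to record the trace; real inputs are supplied at inference time.
inputs = tokenizer("DJL is a deep learning library for Java.", return_tensors="pt")
decoder_input_ids = torch.tensor([[model.config.decoder_start_token_id]])

traced = torch.jit.trace(
    model,
    (inputs["input_ids"], inputs["attention_mask"], decoder_input_ids),
)
traced.save("bart-large-cnn.pt")  # output file name is arbitrary
```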