itsucks changed the title from *What should I do to run bertforclassification model with light seq?* to *What should I do to run bertforclassification model with lightseq?* on Oct 30, 2020.
@itsucks We have implemented BERT in model/encoder.cu.cc, but because downstream tasks are so diverse, we didn't provide a full example. However, you can do it with the following procedure:

1. Write an encoder server based on server/transformer_server.cu.cc to get the last layer's output.
2. Use that output to compute the final result for your task.
This should be fast enough, because most of the computing time is spent in the Transformer blocks, provided your downstream task is not too heavy (e.g., two FFN layers).

We are also implementing a Python wrapper to replace TRTIS, which will give you the flexibility to call LightSeq from your own deployment code. It will be released as soon as possible.
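To illustrate step 2, here is a minimal client-side sketch of a classification head applied to the encoder's last-layer output. This is **not** the LightSeq API: the array shapes, the `classify` helper, and the head weights are all hypothetical placeholders standing in for whatever your server returns and whatever weights your fine-tuned model uses.

```python
import numpy as np

def classify(hidden_states, weight, bias):
    """Hypothetical head: hidden_states is the (seq_len, hidden_dim)
    last-layer output for one sequence, as returned by the encoder server.
    Pools the [CLS] token (position 0), then applies linear + softmax."""
    cls_vec = hidden_states[0]             # [CLS] pooling
    logits = cls_vec @ weight + bias       # (num_labels,)
    exp = np.exp(logits - logits.max())    # numerically stable softmax
    return exp / exp.sum()

# Toy usage: random tensors stand in for real server output and weights.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((128, 768)).astype(np.float32)  # seq_len=128, hidden=768
w = (rng.standard_normal((768, 2)) * 0.02).astype(np.float32)  # 2-class head
b = np.zeros(2, dtype=np.float32)
probs = classify(hidden, w, b)
print(probs.shape, float(probs.sum()))
```

Since the head is just one matrix multiply over a single vector, its cost is negligible next to the Transformer blocks, which is why keeping it outside the CUDA server is fine.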