Export models to Tensorflow or ONNX #712
Unanswered · neelakanth asked this question in Q&A
Replies: 1 comment
Hey, we didn't write specific scripts. In practice we did a separate conversion for each particular model. In the end a model is just weights, so you dump the weights from flashlight into, say, a text file, then load them in TensorFlow in the proper manner, with a matching model definition.
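A minimal sketch of the TensorFlow side of that approach. The dump format (one flat, whitespace-separated float array per layer tensor, one per line) and the model shapes are hypothetical, and the flashlight-side dump is simulated here just so the example is self-contained:

```python
import numpy as np
import tensorflow as tf

# Hypothetical dump format: one flat, whitespace-separated float array
# per layer tensor, one per line, in layer order.
def load_dumped_weights(path):
    with open(path) as f:
        return [np.array(line.split(), dtype=np.float32)
                for line in f if line.strip()]

# Rebuild the same architecture in Keras (shapes here are made up).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4),
    tf.keras.layers.Dense(2),
])
reference = model.get_weights()  # kernel, bias, kernel, bias

# Simulate the flashlight-side dump so the sketch is self-contained;
# in practice this file would be written from flashlight (C++).
with open("weights.txt", "w") as f:
    for w in reference:
        f.write(" ".join(f"{v:.8e}" for v in w.ravel()) + "\n")

# Load the flat arrays, reshape each to the shape Keras expects,
# and install them into the model.
flat = load_dumped_weights("weights.txt")
shaped = [w.reshape(ref.shape) for w, ref in zip(flat, reference)]
model.set_weights(shaped)
```

The key point is that the tensor order and layout conventions must match between the two frameworks; if flashlight stores a kernel transposed relative to Keras, you would transpose it here before `set_weights`.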
Hi,
I am looking for examples of how to export models trained with flashlight to TensorFlow or ONNX for inference.
Regards,
Neelakanth