Heavy model (100-500MB): is it realistic? #415
shurshilov started this conversation in Ideas
Replies: 1 comment 3 replies
- What is the exact use case? If it's to run in the browser, then no: running inside a browser always goes through more layers than necessary (WebGL, WASM, or WebGPU), and you're limited in how big a model you're allowed to run in the browser to begin with. If it's to run on a backend, then yes. Accuracy of what? In general, though, that is not the purpose of this library.
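To make the browser-vs-backend distinction above concrete, here is a minimal, hypothetical sketch (not part of this library) of a feasibility check before fetching a heavy model. The function name `canLoadLargeModel` and the 10%-of-RAM threshold are illustrative assumptions; `navigator.deviceMemory` is a real but Chrome-only API.

```javascript
// Hypothetical helper: decide whether a large model is practical to
// fetch in the current environment. Thresholds are illustrative only.
function canLoadLargeModel(modelSizeMB) {
  // In Node (backend), browser layer limits don't apply; assume OK.
  if (typeof navigator === 'undefined') return true;
  // navigator.deviceMemory (Chromium-only) reports approximate RAM in GB;
  // fall back to a conservative 4GB when the API is unavailable.
  const deviceMemoryGB = navigator.deviceMemory ?? 4;
  // Rough heuristic: keep the model under ~10% of device RAM.
  return modelSizeMB <= deviceMemoryGB * 1024 * 0.1;
}
```

A kiosk backend would always pass this check, while a low-memory browser client would be steered toward the smaller default models.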
- What if we made a very heavy but more accurate model? For kiosk modes, system administrators would download at least 100-500 megabytes of neural network, but they would only do it once, and the accuracy, as I understand it, would be higher.