OutOfMemoryError: CUDA out of memory #5
Try closing some GPU memory-consuming programs. For ChatGLM, 6 GB of VRAM is just barely enough to run.
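As a rough rule of thumb, the VRAM a model needs is its parameter count times the bytes per parameter, plus some fixed overhead for activations and the CUDA context. A minimal sketch (the 1.5 GB overhead figure is an illustrative assumption, not a measured value):

```python
def estimate_vram_gb(n_params: float, bytes_per_param: float, overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: weight storage plus a fixed overhead
    for activations, KV cache, and the CUDA context (assumed)."""
    return n_params * bytes_per_param / (1024 ** 3) + overhead_gb

# ChatGLM-6B has roughly 6.2e9 parameters.
print(round(estimate_vram_gb(6.2e9, 2), 1))    # fp16 weights  -> ~13.0 GB
print(round(estimate_vram_gb(6.2e9, 0.5), 1))  # int4 quantized -> ~4.4 GB
```

This is consistent with the comment above: on a 6 GB card like the RTX 2060, only the quantized variants of ChatGLM-6B leave any headroom.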
@Terrency I'm developing RWKV-Runner for the best possible experience: it avoids running out of VRAM and adapts to GPUs from 2 GB to 20 GB. The model is licensed for commercial use, is very flexible, and its architecture has real potential. An initial version should be out within about a week.
@josStorer How are you this good? Looking forward to the new release. I just want to set this up so my team can look up information internally.
My card is an NVIDIA GeForce RTX 2060 with 6144 MB of memory.