Issues: huggingface/transformers-bloom-inference
Issues list
ValueError: Couldn't instantiate the backend tokenizer from one of: (#101, opened Jun 30, 2023 by SeekPoint)
The Makefile execution was successful, but there is no response when entering text. (#96, opened Jun 2, 2023 by dizhenx)
AttributeError: 'BloomForCausalLM' object has no attribute 'module' (#95, opened Jun 1, 2023 by detectiveJoshua)
Inference(chatbot) does not work as expected on 2 gpus with bigscience/bloom-7b1 model (#90, opened May 19, 2023 by dantalyon)
Big batchsize cause OOM in bloom-ds-inference.py, how to adjust max_split_size_mb value (#84, opened Apr 27, 2023 by tohneecao)
"bloom-ds-zero-inference.py" works but "inference_server.cli --deployment_framework ds_zero" fails (#68, opened Mar 22, 2023 by richarddwang)
The generated results are different when using greedy search during generation (#65, opened Mar 14, 2023 by FrostML)