Notice that only llava is loaded, not long_term_memory:
```
Gradio HTTP request redirected to localhost :)
Warning: the gradio "share link" feature downloads a proprietary and
unaudited blob to create a reverse tunnel. This is potentially dangerous.
bin /home/hangyu5/anaconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda113.so
Loading llava-13b-v0-4bit-128g...
Found the following quantized model: models/llava-13b-v0-4bit-128g/llava-13b-v0-4bit-128g.safetensors
Loading model ...
Done.
Using the following device map for the quantized model: {'model.embed_tokens': 0, 'model.layers.0': 0, 'model.layers.1': 0, 'model.layers.2': 0, 'model.layers.3': 0, 'model.layers.4': 0, 'model.layers.5': 0, 'model.layers.6': 0, 'model.layers.7': 0, 'model.layers.8': 0, 'model.layers.9': 0, 'model.layers.10': 0, 'model.layers.11': 0, 'model.layers.12': 0, 'model.layers.13': 0, 'model.layers.14': 0, 'model.layers.15': 0, 'model.layers.16': 0, 'model.layers.17': 0, 'model.layers.18': 1, 'model.layers.19': 1, 'model.layers.20': 1, 'model.layers.21': 1, 'model.layers.22': 1, 'model.layers.23': 1, 'model.layers.24': 1, 'model.layers.25': 1, 'model.layers.26': 1, 'model.layers.27': 1, 'model.layers.28': 1, 'model.layers.29': 1, 'model.layers.30': 1, 'model.layers.31': 1, 'model.layers.32': 1, 'model.layers.33': 1, 'model.layers.34': 1, 'model.layers.35': 1, 'model.layers.36': 1, 'model.layers.37': 1, 'model.layers.38': 1, 'model.layers.39': 1, 'model.norm': 1, 'lm_head': 1}
Loaded the model in 5.02 seconds.
Loading the extension "llava"... Ok.
LLaVA - Loading openai/clip-vit-large-patch14 as torch.float32 on cuda:0...
LLaVA - Loading liuhaotian/LLaVA-13b-pretrain-projector-v0 as torch.float32 on cuda:0...
LLaVA supporting models loaded, took 3.08 seconds
Running on local URL: http://127.0.0.1:7860
Running on public URL: https://6c4e8aa87007c87f31.gradio.live
This share link expires in 72 hours. For free permanent hosting and GPU upgrades (NEW!), check out Spaces: https://huggingface.co/spaces
```
### System Info
```shell
Ubuntu 22.04
dual RTX 3090
```
I think only one extension can use the custom_generate_chat_prompt function, and both llava and long_term_memory define it, so right now only the first extension will work.
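To illustrate the collision, here is a minimal sketch of a "first extension wins" hook dispatcher; the names and structure are assumptions for illustration, not the actual text-generation-webui loader code:

```python
# Hypothetical sketch of "first extension wins" hook dispatch; not the real
# text-generation-webui implementation, just an illustration of the collision.
from types import SimpleNamespace

# Two loaded extensions, both exposing the same prompt hook.
llava = SimpleNamespace(
    custom_generate_chat_prompt=lambda text, state: f"<llava prompt: {text}>"
)
long_term_memory = SimpleNamespace(
    custom_generate_chat_prompt=lambda text, state: f"<LTM prompt: {text}>"
)

loaded_extensions = [llava, long_term_memory]  # order follows --extensions

def generate_chat_prompt(text, state):
    """Return the prompt built by the first extension that defines the hook."""
    for ext in loaded_extensions:
        hook = getattr(ext, "custom_generate_chat_prompt", None)
        if hook is not None:
            return hook(text, state)  # long_term_memory's hook never runs
    return text  # default prompt when no extension overrides it

print(generate_chat_prompt("hello", state={}))  # uses llava's hook only
```

Under that assumption, whichever extension is listed first in --extensions wins, and the other extension's prompt hook is silently skipped.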
yhyu13 changed the title from "How to assing multiple extensions to --extensions?" to "How to add multiple extensions to --extensions?" on Apr 30, 2023.
Describe the bug
I know you can enable multiple extensions from the web UI, but how do you load them at startup with the command-line argument --extensions?
The README.md says
But
does not seem to work
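For reference, a flag like --extensions that accepts a list of names is typically declared with argparse's nargs, in which case the names are passed separated by spaces. The snippet below is a sketch under that assumption, not code copied from server.py:

```python
# Sketch (assumption, not the actual server.py) of an argparse flag that takes
# several extension names; under this assumption the invocation would be
# space-separated, e.g.:  python server.py --extensions llava long_term_memory
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--extensions",
    nargs="+",
    default=[],
    help="The list of extensions to load, separated by spaces.",
)

args = parser.parse_args(["--extensions", "llava", "long_term_memory"])
print(args.extensions)  # ['llava', 'long_term_memory']
```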
Is there an existing issue for this?
Reproduction
Install the llava and long_term_memory extensions
Then run with these arguments:
Screenshot
No response
Logs