
How to add multiple extensions to --extensions? #1656

Closed
1 task done
yhyu13 opened this issue Apr 29, 2023 · 3 comments
Labels
bug Something isn't working

Comments


yhyu13 commented Apr 29, 2023

Describe the bug

I know you can enable multiple extensions in the web UI, but how do you launch them at startup using the command-line argument --extensions?

The README.md says

--extensions EXTENSIONS [EXTENSIONS ...] The list of extensions to load. If you want to load more than one extension, write the names separated by spaces.

But

--extensions llava long-term-memory 

does not seem to work

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Install extensions

git clone https://github.com/wawawario2/long_term_memory extensions/long_term_memory
pip install -r extensions/long_term_memory/requirements.txt

git clone https://huggingface.co/wojtab/llava-13b-v0-4bit-128g.git ./models
cd ./models/llava-13b-v0-4bit-128g && git lfs pull

Then run with these args

python server.py --model llava-13b-v0-4bit-128g --model_type llama --extensions llava long-term-memory --share

Screenshot

No response

Logs

Notice only llava is loaded but not long-term-memory

Gradio HTTP request redirected to localhost :)
Warning: the gradio "share link" feature downloads a proprietary and
unaudited blob to create a reverse tunnel. This is potentially dangerous.

bin /home/hangyu5/anaconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda113.so
Loading llava-13b-v0-4bit-128g...
Found the following quantized model: models/llava-13b-v0-4bit-128g/llava-13b-v0-4bit-128g.safetensors
Loading model ...
Done.
Using the following device map for the quantized model: {'model.embed_tokens': 0, 'model.layers.0': 0, 'model.layers.1': 0, 'model.layers.2': 0, 'model.layers.3': 0, 'model.layers.4': 0, 'model.layers.5': 0, 'model.layers.6': 0, 'model.layers.7': 0, 'model.layers.8': 0, 'model.layers.9': 0, 'model.layers.10': 0, 'model.layers.11': 0, 'model.layers.12': 0, 'model.layers.13': 0, 'model.layers.14': 0, 'model.layers.15': 0, 'model.layers.16': 0, 'model.layers.17': 0, 'model.layers.18': 1, 'model.layers.19': 1, 'model.layers.20': 1, 'model.layers.21': 1, 'model.layers.22': 1, 'model.layers.23': 1, 'model.layers.24': 1, 'model.layers.25': 1, 'model.layers.26': 1, 'model.layers.27': 1, 'model.layers.28': 1, 'model.layers.29': 1, 'model.layers.30': 1, 'model.layers.31': 1, 'model.layers.32': 1, 'model.layers.33': 1, 'model.layers.34': 1, 'model.layers.35': 1, 'model.layers.36': 1, 'model.layers.37': 1, 'model.layers.38': 1, 'model.layers.39': 1, 'model.norm': 1, 'lm_head': 1}
Loaded the model in 5.02 seconds.
Loading the extension "llava"... Ok.
LLaVA - Loading openai/clip-vit-large-patch14 as torch.float32 on cuda:0...
LLaVA - Loading liuhaotian/LLaVA-13b-pretrain-projector-v0 as torch.float32 on cuda:0...
LLaVA supporting models loaded, took 3.08 seconds
Running on local URL:  http://127.0.0.1:7860
Running on public URL: https://6c4e8aa87007c87f31.gradio.live

This share link expires in 72 hours. For free permanent hosting and GPU upgrades (NEW!), check out Spaces: https://huggingface.co/spaces


System Info

```shell
Ubuntu 22.04
dual RTX 3090
```

@yhyu13 yhyu13 added the bug Something isn't working label Apr 29, 2023
@IJumpAround (Contributor) commented:

Extensions are space-separated, but long-term-memory should be long_term_memory; the name must match the extension's directory name under extensions/.
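The fix above can be illustrated with a minimal sketch, assuming the flag is parsed with argparse's `nargs="+"` and each name is imported as a module under extensions/ (the parser setup below is illustrative, not the project's exact code):

```python
import argparse

# Sketch of how --extensions collects space-separated names into a list.
parser = argparse.ArgumentParser()
parser.add_argument("--extensions", nargs="+", default=None)

args = parser.parse_args(["--extensions", "llava", "long_term_memory"])
print(args.extensions)  # ['llava', 'long_term_memory']

# Each name must match a directory under extensions/, because the loader
# imports it as a Python module path (e.g. extensions.long_term_memory).
# "long-term-memory" cannot work: hyphens are invalid in module names.
assert not "long-term-memory".isidentifier()
assert "long_term_memory".isidentifier()
```

This is why the hyphenated spelling fails silently while the underscored one loads.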

@ChrisRhw commented:

I think only one extension can use the custom_generate_chat_prompt function, and both llava and long_term_memory define it, so right now only the first extension will work.
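A hypothetical sketch of why only the first extension's override takes effect: if the host scans loaded extensions in order and uses the first one that defines custom_generate_chat_prompt, later overrides are silently ignored (the classes and dispatch loop below are illustrative; the real webui's hook resolution may differ):

```python
# Two stand-in extensions, both defining the same hook.
class Llava:
    def custom_generate_chat_prompt(self, text):
        return f"[llava prompt] {text}"

class LongTermMemory:
    def custom_generate_chat_prompt(self, text):
        return f"[LTM prompt] {text}"

loaded_extensions = [Llava(), LongTermMemory()]

def generate_chat_prompt(text):
    # First-match-wins: the loop returns on the first extension
    # that provides the hook, so LongTermMemory's version never runs.
    for ext in loaded_extensions:
        hook = getattr(ext, "custom_generate_chat_prompt", None)
        if hook is not None:
            return hook(text)
    return text

print(generate_chat_prompt("hello"))  # [llava prompt] hello
```

Under this model both extensions load without error, which matches the log above: llava reports "Ok." while long_term_memory's prompt hook simply never fires.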

@yhyu13 yhyu13 changed the title How to assing multiple extensions to --extensions? How to add multiple extensions to --extensions? Apr 30, 2023

yhyu13 commented Apr 30, 2023

It magically works now

@yhyu13 yhyu13 closed this as completed Apr 30, 2023
3 participants