failed to load character annoy metadata, generating from scratch... #7
Will need to find a more broadly supported value I can use instead of |
Thank you for identifying the issue and making a ticket. I was using https://huggingface.co/TheBloke/Manticore-13B-GGML
Same error in WSL, using Airoboros 13B by TheBloke (4-bit, 128): failed to load character annoy metadata, generating from scratch...
@bbecausereasonss That looks to be an unrelated issue. I've created a new ticket for it: #11. You may want to subscribe to that issue to know when a resolution is merged.
fix(#7): Add a get hidden size helper function to better cover GGML models and models that do not supply the hidden size in the config
This issue should be fixed in the main branch.
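The commit message above describes the shape of the fix: resolve the hidden size through a helper rather than reading `shared.model.model.config.hidden_size` directly, since llama.cpp-backed `Llama` objects have no `config` attribute (the `AttributeError` in the traceback below). A minimal sketch of such a fallback, assuming illustrative attribute names (`n_embd`, `d_model`) rather than the extension's actual implementation:

```python
def get_hidden_size(model, default=None):
    """Best-effort lookup of the embedding width across model backends."""
    # Hugging Face transformers models: the size lives on model.config.
    config = getattr(model, "config", None)
    if config is not None:
        for attr in ("hidden_size", "n_embd", "d_model"):
            size = getattr(config, attr, None)
            if size is not None:
                return size
    # llama.cpp-backed 'Llama' objects have no .config; they may expose
    # the embedding width through a method instead (name assumed here).
    n_embd = getattr(model, "n_embd", None)
    if callable(n_embd):
        return n_embd()
    return default
```

The failing call could then become something like `AnnoyIndex(get_hidden_size(shared.model.model), 'angular')`, degrading gracefully instead of raising when the backend lacks a `config`.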
Traceback (most recent call last):
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\routes.py", line 414, in run_predict
output = await app.get_blocks().process_api(
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\blocks.py", line 1323, in process_api
result = await self.call_function(
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\blocks.py", line 1067, in call_function
prediction = await utils.async_iteration(iterator)
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\utils.py", line 339, in async_iteration
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\utils.py", line 339, in async_iteration
return await iterator.__anext__()
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\utils.py", line 332, in __anext__
return await anyio.to_thread.run_sync(
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\anyio_backends_asyncio.py", line 937, in run_sync_in_worker_thread
return await future
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\anyio_backends_asyncio.py", line 867, in run
result = context.run(func, *args)
File "K:\1ai\oog\oobabooga-windows\installer_files\env\lib\site-packages\gradio\utils.py", line 315, in run_sync_iterator_async
return next(iterator)
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\modules\chat.py", line 319, in generate_chat_reply_wrapper
for i, history in enumerate(generate_chat_reply(text, shared.history, state, regenerate, _continue, loading_message=True)):
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\modules\chat.py", line 313, in generate_chat_reply
for history in chatbot_wrapper(text, history, state, regenerate=regenerate, _continue=_continue, loading_message=loading_message):
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\modules\chat.py", line 226, in chatbot_wrapper
prompt = apply_extensions('custom_generate_chat_prompt', text, state, **kwargs)
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\modules\extensions.py", line 193, in apply_extensions
return EXTENSION_MAP[typ](*args, **kwargs)
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\modules\extensions.py", line 80, in _apply_custom_generate_chat_prompt
return extension.custom_generate_chat_prompt(text, state, **kwargs)
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\extensions\annoy_ltm\script.py", line 672, in custom_generate_chat_prompt
return generator.custom_generate_chat_prompt(user_input, state, **kwargs)
File "K:\1ai\oog\oobabooga-windows\text-generation-webui\extensions\annoy_ltm\script.py", line 519, in custom_generate_chat_prompt
loaded_annoy_index = AnnoyIndex(shared.model.model.config.hidden_size, 'angular')
AttributeError: 'Llama' object has no attribute 'config'
failed to load character annoy metadata, generating from scratch...
Originally posted by @emangamer in #2 (comment)