
AttributeError: function 'TVMGetLastPythonError' not found. Did you mean: 'TVMGetLastError'? #15

Open
helqasem opened this issue Oct 3, 2023 · 1 comment


helqasem commented Oct 3, 2023

Hi there,

While attempting to run a basic prompt with the local Llama2-7b plugin, I received an AttributeError:
"AttributeError: function 'TVMGetLastPythonError' not found. Did you mean: 'TVMGetLastError'?"

Environment is Windows 11, Python 3.11.

During setup I hit the same tvm issue detailed in mlc-ai/mlc-llm#875. I was able to progress by downloading libzstd and renaming it to zstd.dll as noted in that issue.
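Roughly, that workaround amounts to the following sketch (the tvm install directory is illustrative; the only real facts here are the two DLL names from the linked issue):

```python
import pathlib
import shutil

def provide_zstd_dll(tvm_dir: pathlib.Path) -> bool:
    """Sketch of the mlc-ai/mlc-llm#875 workaround (paths illustrative).

    tvm's Windows loader expects a DLL named zstd.dll, but the download
    provides libzstd.dll, so copying it under the expected name lets the
    import succeed. Returns True if zstd.dll is present afterwards.
    """
    src = tvm_dir / "libzstd.dll"
    dst = tvm_dir / "zstd.dll"
    if src.exists() and not dst.exists():
        shutil.copy(src, dst)
    return dst.exists()
```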

I've followed the instructions here for downloading and using plugins: https://pypi.org/project/llm/

After install and setup, the first suggested use is:
"llm -m llama2 'difference between a llama and an alpaca'"
The AttributeError is raised after running this command.

Full stack trace:

```
PS C:\Users\elqas\OneDrive\DevStuff\CSIDB\LLMAI> llm -m llama2_7b 'difference between a llama and an alpaca'
Traceback (most recent call last):
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Lib\site-packages\llm\cli.py", line 276, in prompt
    for chunk in response:
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Lib\site-packages\llm\models.py", line 91, in __iter__
    for chunk in self.model.execute(
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Lib\site-packages\llm_mlc.py", line 302, in execute
    self.chat_mod = StreamingChatModule(model=self.model_path)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\mlc_chat\chat_module.py", line 609, in __init__
    self._reload(self.lib_path, self.model_path, user_chat_config_json_str)
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\mlc_chat\chat_module.py", line 790, in _reload
    self.reload_func(lib, model_path, app_config_json)
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\tvm\_ffi\ctypes\packed_func.py", line 239, in __call__
    raise_last_ffi_error()
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\tvm\_ffi\base.py", line 415, in raise_last_ffi_error
    _LIB.TVMGetLastPythonError.restype = ctypes.c_void_p
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Lib\ctypes\__init__.py", line 389, in __getattr__
    func = self.__getitem__(name)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Lib\ctypes\__init__.py", line 394, in __getitem__
    func = self._FuncPtr((name_or_ordinal, self))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: function 'TVMGetLastPythonError' not found. Did you mean: 'TVMGetLastError'?

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Lib\site-packages\llm\cli.py", line 283, in prompt
    raise click.ClickException(str(ex))
click.exceptions.ClickException: function 'TVMGetLastPythonError' not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\elqas\AppData\Local\Programs\Python\Python311\Scripts\llm.exe\__main__.py", line 7, in <module>
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\core.py", line 1095, in main
    e.show()
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\exceptions.py", line 44, in show
    echo(_("Error: {message}").format(message=self.format_message()), file=file)
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\utils.py", line 318, in echo
    file.write(out)  # type: ignore
    ^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\_compat.py", line 542, in _safe_write
    return _write(s)
           ^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\colorama\ansitowin32.py", line 47, in write
    self.__convertor.write(text)
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\colorama\ansitowin32.py", line 177, in write
    self.write_and_convert(text)
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\colorama\ansitowin32.py", line 205, in write_and_convert
    self.write_plain_text(text, cursor, len(text))
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\colorama\ansitowin32.py", line 210, in write_plain_text
    self.wrapped.write(text[start:end])
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\_winconsole.py", line 192, in write
    return self._text_stream.write(x)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\elqas\AppData\Roaming\Python\Python311\site-packages\click\_winconsole.py", line 177, in write
    raise OSError(self._get_error_message(GetLastError()))
OSError: Windows error 6
```
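The final frames are just ctypes reporting that the symbol TVMGetLastPythonError is absent from the loaded tvm DLL, which (as a guess, not a confirmed diagnosis) suggests the mlc_chat Python bindings expect a newer tvm runtime than the one actually loaded. The error shape is easy to reproduce in isolation; libc/msvcrt here is only a stand-in for any DLL that does not export the symbol:

```python
import ctypes
import ctypes.util

# Load any available C runtime as a stand-in for tvm's DLL.
libname = ctypes.util.find_library("c") or "msvcrt"
lib = ctypes.CDLL(libname)

def lookup(symbol: str):
    """Return the foreign function if the library exports it, else None.

    ctypes resolves attributes lazily via dlsym/GetProcAddress and raises
    AttributeError for missing exports, exactly as in the tvm frames above.
    """
    try:
        return getattr(lib, symbol)
    except AttributeError:
        return None

print(lookup("printf") is not None)     # printf is exported by libc/msvcrt
print(lookup("TVMGetLastPythonError"))  # not exported here, so None
```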

The same error occurs when attempting to use "llm chat -m llama2".

Any assistance appreciated.


zeeroh commented Oct 15, 2023

Yeah, I'm also getting the exact same error. My setup is the same: I'm trying to run the same commands from the llm instructions as you are and getting the same result. I too had to copy the zstd.dll file when running llm mlc setup in order to get the setup to work. The only difference I can see is that I'm on Windows 10 instead of 11. 🤷
