
bug: When running by example getting error: TypeError: 'dict' object is not callable #13867

Closed
2 of 14 tasks
lolevsky opened this issue Nov 26, 2023 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: models Related to LLMs or chat model modules

Comments

@lolevsky

System Info

Environment variable

BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''

System information

bentoml: 1.1.10
python: 3.11.5
platform: Linux-6.2.0-37-generic-x86_64-with-glibc2.35
uid_gid: 1000:1000
conda: 23.7.4
in_conda_env: True

conda_packages
name: openllm
channels:
  - defaults
dependencies:
  - _libgcc_mutex=0.1=main
  - _openmp_mutex=5.1=1_gnu
  - bzip2=1.0.8=h7b6447c_0
  - ca-certificates=2023.08.22=h06a4308_0
  - ld_impl_linux-64=2.38=h1181459_1
  - libffi=3.4.4=h6a678d5_0
  - libgcc-ng=11.2.0=h1234567_1
  - libgomp=11.2.0=h1234567_1
  - libstdcxx-ng=11.2.0=h1234567_1
  - libuuid=1.41.5=h5eee18b_0
  - ncurses=6.4=h6a678d5_0
  - openssl=3.0.12=h7f8727e_0
  - pip=23.3.1=py311h06a4308_0
  - python=3.11.5=h955ad1f_0
  - readline=8.2=h5eee18b_0
  - setuptools=68.0.0=py311h06a4308_0
  - sqlite=3.41.2=h5eee18b_0
  - tk=8.6.12=h1ccaba5_0
  - wheel=0.41.2=py311h06a4308_0
  - xz=5.4.2=h5eee18b_0
  - zlib=1.2.13=h5eee18b_0
  - pip:
      - accelerate==0.24.1
      - aiohttp==3.9.0
      - aiosignal==1.3.1
      - anyio==3.7.1
      - appdirs==1.4.4
      - asgiref==3.7.2
      - attrs==23.1.0
      - beautifulsoup4==4.12.2
      - bentoml==1.1.10
      - bitsandbytes==0.41.2.post2
      - build==0.10.0
      - cattrs==23.1.2
      - certifi==2023.11.17
      - charset-normalizer==3.3.2
      - circus==0.18.0
      - click==8.1.7
      - click-option-group==0.5.6
      - cloudpickle==3.0.0
      - coloredlogs==15.0.1
      - contextlib2==21.6.0
      - cuda-python==12.3.0
      - dataclasses-json==0.6.2
      - datasets==2.15.0
      - deepmerge==1.1.0
      - deprecated==1.2.14
      - dill==0.3.7
      - distlib==0.3.7
      - distro==1.8.0
      - einops==0.7.0
      - fastapi==0.104.1
      - fastcore==1.5.29
      - filelock==3.13.1
      - filetype==1.2.0
      - frozenlist==1.4.0
      - fs==2.4.16
      - fsspec==2023.10.0
      - ghapi==1.0.4
      - greenlet==3.0.1
      - h11==0.14.0
      - httpcore==1.0.2
      - httptools==0.6.1
      - httpx==0.25.2
      - huggingface-hub==0.19.4
      - humanfriendly==10.0
      - idna==3.6
      - importlib-metadata==6.8.0
      - inflection==0.5.1
      - jinja2==3.1.2
      - jsonpatch==1.33
      - jsonpointer==2.4
      - jsonschema==4.20.0
      - jsonschema-specifications==2023.11.1
      - langchain==0.0.340
      - langsmith==0.0.66
      - markdown-it-py==3.0.0
      - markupsafe==2.1.3
      - marshmallow==3.20.1
      - mdurl==0.1.2
      - mpmath==1.3.0
      - msgpack==1.0.7
      - multidict==6.0.4
      - multiprocess==0.70.15
      - mypy-extensions==1.0.0
      - networkx==3.2.1
      - ninja==1.11.1.1
      - numpy==1.26.2
      - nvidia-cublas-cu12==12.1.3.1
      - nvidia-cuda-cupti-cu12==12.1.105
      - nvidia-cuda-nvrtc-cu12==12.1.105
      - nvidia-cuda-runtime-cu12==12.1.105
      - nvidia-cudnn-cu12==8.9.2.26
      - nvidia-cufft-cu12==11.0.2.54
      - nvidia-curand-cu12==10.3.2.106
      - nvidia-cusolver-cu12==11.4.5.107
      - nvidia-cusparse-cu12==12.1.0.106
      - nvidia-ml-py==11.525.150
      - nvidia-nccl-cu12==2.18.1
      - nvidia-nvjitlink-cu12==12.3.101
      - nvidia-nvtx-cu12==12.1.105
      - openllm==0.4.28
      - openllm-client==0.4.28
      - openllm-core==0.4.28
      - opentelemetry-api==1.20.0
      - opentelemetry-instrumentation==0.41b0
      - opentelemetry-instrumentation-aiohttp-client==0.41b0
      - opentelemetry-instrumentation-asgi==0.41b0
      - opentelemetry-sdk==1.20.0
      - opentelemetry-semantic-conventions==0.41b0
      - opentelemetry-util-http==0.41b0
      - optimum==1.14.1
      - orjson==3.9.10
      - packaging==23.2
      - pandas==2.1.3
      - pathspec==0.11.2
      - pillow==10.1.0
      - pip-requirements-parser==32.0.1
      - pip-tools==7.3.0
      - platformdirs==4.0.0
      - prometheus-client==0.19.0
      - protobuf==4.25.1
      - psutil==5.9.6
      - pyarrow==14.0.1
      - pyarrow-hotfix==0.6
      - pydantic==1.10.13
      - pygments==2.17.2
      - pyparsing==3.1.1
      - pyproject-hooks==1.0.0
      - python-dateutil==2.8.2
      - python-dotenv==1.0.0
      - python-json-logger==2.0.7
      - python-multipart==0.0.6
      - pytz==2023.3.post1
      - pyyaml==6.0.1
      - pyzmq==25.1.1
      - ray==2.8.0
      - referencing==0.31.0
      - regex==2023.10.3
      - requests==2.31.0
      - rich==13.7.0
      - rpds-py==0.13.1
      - safetensors==0.4.0
      - schema==0.7.5
      - scipy==1.11.4
      - sentencepiece==0.1.99
      - simple-di==0.1.5
      - six==1.16.0
      - sniffio==1.3.0
      - soupsieve==2.5
      - sqlalchemy==2.0.23
      - starlette==0.27.0
      - sympy==1.12
      - tenacity==8.2.3
      - tokenizers==0.15.0
      - torch==2.1.0
      - tornado==6.3.3
      - tqdm==4.66.1
      - transformers==4.35.2
      - triton==2.1.0
      - typing-extensions==4.8.0
      - typing-inspect==0.9.0
      - tzdata==2023.3
      - urllib3==2.1.0
      - uvicorn==0.24.0.post1
      - uvloop==0.19.0
      - virtualenv==20.24.7
      - vllm==0.2.2
      - watchfiles==0.21.0
      - websockets==12.0
      - wrapt==1.16.0
      - xformers==0.0.22.post7
      - xxhash==3.4.1
      - yarl==1.9.3
      - zipp==3.17.0
prefix: /home/lolevsky/anaconda3/envs/openllm

Who can help?

No response

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

I am following the example and wrote the following code (import shown for completeness):

from langchain.llms import OpenLLM

llm = OpenLLM(server_url=server_url, server_type='http')
llm("What is the difference between a duck and a goose? And why there are so many Goose in Canada?")

The first request does reach the server:

(scheme=http,method=POST,path=/v1/metadata,type=application/json,length=2) (status=200

Up to this point it looks promising, but then I get the error TypeError: 'dict' object is not callable.

Full traceback:

Traceback (most recent call last):
  File "/home/lolevsky/Github/Zodiac/main.py", line 24, in <module>
    run_zodiac()
  File "/home/lolevsky/Github/Zodiac/main.py", line 9, in run_zodiac
    resA = llm("What is the difference between a duck and a goose? And why there are so many Goose in Canada?")
  File "/usr/local/lib/python3.10/dist-packages/langchain/llms/base.py", line 876, in __call__
    self.generate(
  File "/usr/local/lib/python3.10/dist-packages/langchain/llms/base.py", line 626, in generate
    params = self.dict()
  File "/usr/local/lib/python3.10/dist-packages/langchain/llms/base.py", line 974, in dict
    starter_dict = dict(self._identifying_params)
  File "/usr/local/lib/python3.10/dist-packages/langchain/llms/openllm.py", line 220, in _identifying_params
    self.llm_kwargs.update(self._client._config())
TypeError: 'dict' object is not callable
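The traceback shows langchain/llms/openllm.py calling self._client._config(), which fails if _config is already a plain dict in newer openllm-client releases rather than a method. A minimal sketch of a compatibility shim that handles both shapes (the client classes below are hypothetical stand-ins, not the real openllm client):

```python
def resolve_config(config_attr):
    """Return the config dict whether the attribute is a zero-argument
    method (older openllm-client) or a plain dict (newer releases)."""
    return config_attr() if callable(config_attr) else config_attr

# Hypothetical stand-ins for the two client API shapes:
class OldStyleClient:
    def _config(self):
        return {"temperature": 0.7}

class NewStyleClient:
    _config = {"temperature": 0.7}

print(resolve_config(OldStyleClient()._config))  # {'temperature': 0.7}
print(resolve_config(NewStyleClient()._config))  # {'temperature': 0.7}
```

A fix along these lines inside _identifying_params would make LangChain tolerant of either client version; the sketch only illustrates the shape of the check.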

To reproduce

This is how I set up the environment:

  • conda create --name openllm python=3.11
  • conda activate openllm
  • pip install openllm
  • pip install langchain

Expected behavior

The call should succeed and send the prompt to the server instead of raising an error.


dosubot bot commented Nov 26, 2023

🤖

Hello @lolevsky,

I hope you're doing well! I'm Dosu, an AI bot here to assist you with your issue and answer any questions you might have. While we wait for a human maintainer, feel free to ask me anything about LangChain. I'm here to make your experience smoother!

Based on the traceback you provided, it seems like the error is originating from the dict() method in the OpenLLM class. The error message TypeError: 'dict' object is not callable typically occurs when you try to call a dictionary as if it were a function.

In your case, it appears that the _client._config() method is returning a dictionary, and the code is trying to call it like a function. This is likely the source of the error.
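The failure mode itself is easy to reproduce in isolation (a standalone illustration, not the actual client code):

```python
# Calling a dict as if it were a function raises the exact error
# reported in the traceback above.
config = {"max_new_tokens": 128}
try:
    config()  # dict instances do not implement __call__
except TypeError as exc:
    message = str(exc)

print(message)  # 'dict' object is not callable
```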

As for the solution, it seems that there was a similar issue encountered by another user in the LangChain repository. The issue titled Retrieval Question/Answering Example not working in 0.0.200 suggests that the _identifying_params method needs to be present in the custom LLM class. This method is mandatory in the newer versions of LangChain, greater than 0.0.164.

However, it's also worth noting whether the OpenLLM class defines a __call__ method, the method that allows instances of a class to be called as functions. If it does not and you try to use an instance of the OpenLLM class as a function, you'll encounter an error. Instead, you can use the _call or _acall method of the OpenLLM class to generate responses from the language model.

Here's an example of how you can do this:

from langchain.llms import OpenLLM
llm = OpenLLM(
    model_name='flan-t5',
    model_id='google/flan-t5-large',
)
response = llm._call("What is the difference between a duck and a goose?")
print(response)

In this example, the _call method of the OpenLLM instance llm is used to generate a response from the language model. The response is then printed to the console.

Please try this solution and let me know if it resolves your issue. If not, please provide more details about your implementation of the OpenLLM class and how you're using it in your code.


@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Nov 26, 2023
@samin-batra

I am getting the same issue. It seems the documentation has not been updated: something has changed such that the OpenLLM constructor returns a dictionary.

@samin-batra

Seems to be an issue here:
[screenshot attached, 2024-01-07]

@HerzogVolpe

I am also getting this issue.

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Apr 8, 2024
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Apr 15, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Apr 15, 2024
Projects
None yet
Development

No branches or pull requests

3 participants