Conversation

@jeffwang0516
Contributor

Fix TypeError that occurred when using openllm.Runner

    lc_llm = OpenLLM(model_name=model_name, model_id=model_id, embedded=False, **attrs)
  File "/opt/conda/lib/python3.8/site-packages/langchain/llms/openllm.py", line 171, in __init__
    runner = openllm.Runner(
  File "/opt/conda/lib/python3.8/site-packages/openllm/_deprecated.py", line 58, in Runner
    model_id = attrs.get('model_id', default=os.getenv('OPENLLM_MODEL_ID', llm_config['default_id']))
TypeError: get() takes no keyword arguments
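The traceback comes from `dict.get()` only accepting its fallback value positionally; passing it as `default=` raises a `TypeError`. A minimal sketch of the failure and the fix (names like `model_id` mirror the traceback, the rest is illustrative):

```python
attrs = {}

# Broken: dict.get() takes no keyword arguments, so this raises TypeError.
try:
    model_id = attrs.get('model_id', default='fallback-model')
except TypeError as e:
    print(e)

# Fixed: pass the fallback positionally.
model_id = attrs.get('model_id', 'fallback-model')
print(model_id)
```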

Collaborator

@aarnphm aarnphm left a comment


Looks good. Thanks!

@aarnphm aarnphm merged commit af88b9b into bentoml:main Nov 15, 2023
@aarnphm
Collaborator

aarnphm commented Nov 15, 2023

Note that there is an upstream langchain PR, langchain-ai/langchain#12968, that switches to the new API.

@jeffwang0516
Contributor Author

That’s good to know. Thanks!
