RuntimeError when setting up self-hosted model + runhouse integration #1290

Closed
dcavadia opened this issue Feb 25, 2023 · 2 comments

@dcavadia

I'm hitting this bug when trying to set up a model on a Lambda Cloud instance by calling SelfHostedHuggingFaceLLM() after the rh.cluster() call.

```python
from langchain.llms import SelfHostedPipeline, SelfHostedHuggingFaceLLM
from langchain import PromptTemplate, LLMChain
import runhouse as rh

# Provision (or reuse) an A10 GPU cluster on Lambda Cloud via Runhouse
gpu = rh.cluster(name="rh-a10", instance_type="A10:1").save()

template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])

# Send the gpt2 model to the remote cluster along with its requirements
llm = SelfHostedHuggingFaceLLM(model_id="gpt2", hardware=gpu, model_reqs=["pip:./", "transformers", "torch"])
```
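For context, once the LLM constructs successfully the chain would be run roughly like this (a sketch following the LLMChain API from that LangChain release; the question string is only an example, and the failure below happens before this point is reached):

```python
# Sketch of the intended usage once SelfHostedHuggingFaceLLM succeeds
llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"
print(llm_chain.run(question))  # runs the gpt2 pipeline on the remote A10 cluster
```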

[screenshot: RuntimeError traceback]

I made sure with `sky check` that the Lambda credentials are set, but the error I get in the log is the one below, which I haven't been able to solve.

[screenshot: error from the cluster log]

If I can get any help solving this, I would appreciate it.

@dongreenberg (Contributor)

This is being discussed in run-house/runhouse#9, so we can close this.

@dosubot
dosubot bot commented Aug 24, 2023

Hi, @dcavadia! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

Based on my understanding of the issue, you encountered a RuntimeError when setting up a self-hosted model on a Lambda Cloud cluster with SelfHostedHuggingFaceLLM() after the rh.cluster() call. You provided code snippets and error screenshots, and there was a suggestion from dongreenberg to discuss the issue in another GitHub thread, which you seemed to agree with.

Before we proceed, we would like to confirm if this issue is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding, and we look forward to hearing from you soon.

dosubot bot added the stale label on Aug 24, 2023
dosubot bot closed this as not planned on Sep 10, 2023
dosubot bot removed the stale label on Sep 10, 2023