Replies: 3 comments 1 reply
-
🤖 Hello @lguillaud! I'm Dosu, a friendly bot here to help you navigate bugs, answer questions, and guide you on your journey as a contributor while we wait for a human maintainer. Delighted to meet you and looking forward to assisting you!

Based on the information available in the LangChain repository, you can specify a custom SSL certificate for a request by passing the path to the certificate file as the `cert` argument. Here is an example of how you can use a custom SSL certificate with the `RequestsWrapper`:

```python
requests_wrapper = RequestsWrapper()
response = requests_wrapper.get('https://example.com', cert='/path/to/certfile')
```

In this example, the `cert` argument is the path to the certificate file. However, I wasn't able to find information on whether the openapi agent exposes this option directly. If you have the SSL certificate file for your server, you could try passing it through as the `cert` argument.

I hope this helps! If you have any other questions, feel free to ask.

-Dosu

Sources

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
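For context (plain `requests` usage, not LangChain-specific): the wrapper ultimately delegates to the `requests` library, which accepts `verify` and `cert` either per call or once on a `Session`. A minimal sketch, with placeholder paths:

```python
import requests

# A Session lets TLS options be set once and inherited by every call made through it.
session = requests.Session()
session.verify = False  # disable certificate verification (insecure; testing only)
# session.verify = "/path/to/ca-bundle.pem"  # or: trust a custom CA bundle
# session.cert = "/path/to/client-cert.pem"  # or: present a client certificate

# Any request made via this session picks up the settings, e.g.:
# response = session.get("https://my-server/api/v1/kafka/topics")
```

Setting `verify` on a `Session` avoids having to repeat the flag on every individual `get`/`post` call.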
-
Thanks to the bot, but it does not answer the question. Does anyone have an example of how to use an OpenAPI spec to make calls with this very specific option, `verify=False`?
-
Hi @lguillaud,

Can you try to update `RequestsGetToolWithParsing` as follows:

```python
class RequestsGetToolWithParsing(BaseRequestsTool, BaseTool):
    """Requests GET tool with LLM-instructed extraction of truncated responses."""

    name: str = "requests_get"
    """Tool name."""
    description = REQUESTS_GET_TOOL_DESCRIPTION
    """Tool description."""
    response_length: int = MAX_RESPONSE_LENGTH
    """Maximum length of the response to be returned."""
    llm_chain: Any = Field(
        default_factory=_get_default_llm_chain_factory(PARSING_GET_PROMPT)
    )
    """LLMChain used to extract the response."""

    def _run(self, text: str) -> str:
        from langchain.output_parsers.json import parse_json_markdown

        try:
            data = parse_json_markdown(text)
        except json.JSONDecodeError as e:
            raise e
        data_params = data.get("params")
        response: str = cast(
            str, self.requests_wrapper.get(data["url"], params=data_params, verify=False)
        )
        response = response[: self.response_length]
        return self.llm_chain.predict(
            response=response, instructions=data["output_instructions"]
        ).strip()
```

I changed line 87:

```python
        response: str = cast(
            str, self.requests_wrapper.get(data["url"], params=data_params, verify=False)
        )
```

Tell me if that fixes the problem. If so, we'll need to create a PR to fix it.
-
Hello,
I am trying to use an openapi agent to make calls to a custom API (served over HTTPS), but I keep getting this error message:
_SSLError: HTTPSConnectionPool(host='my-server', port=443): Max retries exceeded with url: /api/v1/kafka/topics (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (ssl.c:1129)'))).
I have tried to add `requests_kwargs = {"verify": False}` to the agent, but no luck.
Here is the snippet of code I am using:
Does anyone have an idea on how to specify a certificate and/or set insecure = yes?
Thanks