Checked other resources

- I searched the LangChain documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in LangChain rather than my code.
- The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
```python
!pip install langchain_openai langchain-core langchain-mistralai -qU

from typing import Optional

from langchain_core.pydantic_v1 import BaseModel, Field


class Person(BaseModel):
    """Information about a person."""

    # ^ Doc-string for the entity Person.
    # This doc-string is sent to the LLM as the description of the schema Person,
    # and it can help to improve extraction results.

    # Note that:
    # 1. Each field is `Optional` -- this allows the model to decline to extract it!
    # 2. Each field has a `description` -- this description is used by the LLM.
    # Having a good description can help improve extraction results.
    name: Optional[str] = Field(default=None, description="The name of the person")
    hair_color: Optional[str] = Field(
        default=None, description="The color of the person's hair if known"
    )
    height_in_meters: Optional[str] = Field(
        default=None, description="Height measured in meters"
    )


from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

# Define a custom prompt to provide instructions and any additional context.
# 1) You can add examples into the prompt template to improve extraction quality
# 2) Introduce additional parameters to take context into account (e.g., include
#    metadata about the document from which the text was extracted.)
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are an expert extraction algorithm. "
            "Only extract relevant information from the text. "
            "If you do not know the value of an attribute asked to extract, "
            "return null for the attribute's value.",
        ),
        # Please see the how-to about improving performance with
        # reference examples.
        # MessagesPlaceholder('examples'),
        ("human", "{text}"),
    ]
)

from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(model="mistral-large-latest", temperature=0)

runnable = prompt | llm.with_structured_output(schema=Person)

text = "Alan Smith is 6 feet tall and has blond hair."
runnable.invoke({"text": text})
```
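Before invoking the chain, it can help to confirm the key is actually visible to the process; an unset or empty key is exactly what produces the error below. A minimal sanity check, assuming `ChatMistralAI`'s default fallback to the `MISTRAL_API_KEY` environment variable:

```python
import os

# ChatMistralAI falls back to the MISTRAL_API_KEY environment variable
# when no key is passed explicitly; verify it is set and non-empty.
key = os.environ.get("MISTRAL_API_KEY", "")
print("MISTRAL_API_KEY is set:", bool(key))
```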
Error Message and Stack Trace (if applicable)
---------------------------------------------------------------------------
LocalProtocolError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py in map_httpcore_exceptions()
68 try:
---> 69 yield
70 except Exception as exc:
28 frames
LocalProtocolError: Illegal header value b'Bearer '
The above exception was the direct cause of the following exception:
LocalProtocolError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/httpx/_transports/default.py in map_httpcore_exceptions()
84
85 message = str(exc)
---> 86 raise mapped_exc(message) from exc
87
88
LocalProtocolError: Illegal header value b'Bearer '
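The error message itself points at the root cause: with no API key available, the client builds an `Authorization` header whose value is `Bearer ` followed by nothing, and httpx/h11 rejects header values with trailing whitespace. A minimal sketch of the failing value:

```python
# When the bearer token is empty, the Authorization header value ends in a
# bare trailing space -- exactly the b'Bearer ' the traceback calls illegal.
api_key = ""  # what the client effectively uses when the key is unset
header_value = f"Bearer {api_key}".encode("ascii")
print(header_value)  # b'Bearer '
```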
System Information
------------------
> OS: Linux
> OS Version: #1 SMP PREEMPT_DYNAMIC Sat Nov 18 15:31:17 UTC 2023
> Python Version: 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0]
Package Information
-------------------
> langchain_core: 0.1.50
> langsmith: 0.1.53
> langchain_mistralai: 0.1.6
> langchain_openai: 0.1.6
Packages not installed (Not Necessarily a Problem)
--------------------------------------------------
The following packages were not found:
> langgraph
> langserve
Labels: 🔌: openai (Primarily related to OpenAI integrations), 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature), added by dosubot on May 3, 2024.
You're missing the API key, which is why the Authorization header is empty; load it with dotenv. If you're using Mistral, create the API key in your Mistral account. If you're using OpenAI, generate the API key there instead.
It's also part of the documentation:
# Set env vars for the relevant model or load from a .env file:
import dotenv
dotenv.load_dotenv()
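Equivalently, the key can be exported in the environment before constructing the model (a sketch with a placeholder value, not a real key):

```python
import os

# Placeholder value -- substitute a real key from your Mistral account.
os.environ["MISTRAL_API_KEY"] = "your-mistral-api-key"

# Or pass it to the constructor instead of relying on the environment, e.g.:
# llm = ChatMistralAI(model="mistral-large-latest", mistral_api_key="your-mistral-api-key")
```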
Description
Trying the extraction quickstart at https://python.langchain.com/docs/use_cases/extraction/quickstart/