Description
System Info
Verified in the Docker image python:3.9
Who can help?
Information
- The official example notebooks/scripts
- My own modified scripts
Related Components
- LLMs/Chat Models
- Embedding Models
- Prompts / Prompt Templates / Prompt Selectors
- Output Parsers
- Document Loaders
- Vector Stores / Retrievers
- Memory
- Agents / Agent Executors
- Tools / Toolkits
- Chains
- Callbacks/Tracing
- Async
Reproduction
How to reproduce:
- docker run -it python:3.9 bash
- pip install langchain
- pip install openai
- run the following script
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import SystemMessage

# OPENAI_API_KEY must be set in the environment for ChatOpenAI to work.
# n=3 asks the model for three completions per prompt.
chat_model = ChatOpenAI(n=3)
prompt = "What is the capital of France?"
message = SystemMessage(content=prompt)
# n=3 is passed again here, yet only a single message comes back.
responses = chat_model.predict_messages([message], n=3)
print(responses)
Output:
content='The capital of France is Paris.' additional_kwargs={} example=False
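For comparison, sending the same request straight to the OpenAI chat completions API with n=3 does return three choices. A minimal sketch, assuming the pre-1.0 openai Python client installed above and the same default model (gpt-3.5-turbo) that ChatOpenAI uses:

import openai

# The client picks up OPENAI_API_KEY from the environment.
# Same prompt, sent directly to the API with n=3.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "What is the capital of France?"}],
    n=3,
)

# The response contains one choice per requested completion, i.e. three here.
print(len(response.choices))
for choice in response.choices:
    print(choice.message.content)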
Expected behavior
Hello,

When running the script I expect to get 3 responses, because I set the parameter n=3 both when initializing ChatOpenAI and in the call to predict_messages. Still, the response is a single answer.

Please let me know how to use the parameter n correctly, or fix the current behavior; a sketch of what I would expect to work is below.
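For reference, this is a minimal sketch of what I would expect to produce three answers, assuming chat_model.generate and the LLMResult.generations field are the intended way to access all n candidates:

from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import SystemMessage

chat_model = ChatOpenAI(n=3)
message = SystemMessage(content="What is the capital of France?")

# generate takes a batch of message lists and returns an LLMResult.
result = chat_model.generate([[message]])

# generations[0] holds the candidates for the first (and only) prompt;
# with n=3 I would expect three entries here.
for generation in result.generations[0]:
    print(generation.text)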
Best regards,
LW