
Output cutoff with ChatOpenAI #1652

Closed
Anil-matcha opened this issue Mar 14, 2023 · 6 comments · Fixed by #1782

Comments

@Anil-matcha

With the newly released ChatOpenAI model, the completion output is being cut off partway through, seemingly at random.

For example, I used the input below:

Write me an essay on Pune

I got this output:

Pune, also known as Poona, is a city located in the western Indian state of Maharashtra. It is the second-largest city in the state and is often referred to as the "Oxford of the East" due to its reputation as a center of education and research. Pune is a vibrant city with a rich history, diverse culture, and a thriving economy.\n\nThe history of Pune dates back to the 8th century when it was founded by the Rashtrakuta dynasty. Over the centuries, it has been ruled by various dynasties, including the Marathas, the Peshwas, and the British. Pune played a significant role in India's struggle for independence, and many freedom fighters, including Mahatma Gandhi, spent time in the city.\n\nToday, Pune is a bustling metropolis with a population of over 3 million people. It is home to some of the most prestigious educational institutions in India, including the University of Pune, the Indian Institute of Science Education and Research, and the National Defense Academy. The city is also a hub for research and development, with many multinational companies setting up their research centers in Pune.\n\nPune is a city of contrasts, with modern skyscrapers standing alongside ancient temples and historical landmarks. The city's

As you can see, the message is cut off mid-sentence. I followed the official documentation here: https://github.com/hwchase17/langchain/blob/master/docs/modules/chat/getting_started.ipynb

This was not an issue with OpenAIChat, but with ChatOpenAI it is a problem.
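For reference, this is roughly how I am calling the model, following the linked notebook (the import paths are from the langchain version I have installed, so treat them as an assumption):

# Rough repro sketch; import paths follow the linked getting-started
# notebook and may differ in newer langchain versions.
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

chat = ChatOpenAI(temperature=0)
response = chat([HumanMessage(content="Write me an essay on Pune")])
print(response.content)  # the text ends mid-sentence, as pasted above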


sbc-max commented Mar 14, 2023

I believe max_tokens defaults to 256 in ChatOpenAI (whereas it wasn't set in OpenAIChat).

You can try adjusting the parameter when you initialize the LLM:

chat = ChatOpenAI(temperature=0, max_tokens=2056)

There is an open issue to allow passing -1 so it defaults to the model's maximum tokens: #1532
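For completeness, a fuller sketch with the imports included (paths assumed from the langchain layout at the time):

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Raise the completion budget above the 256-token default; note that
# prompt plus completion must still fit the model's context window.
chat = ChatOpenAI(temperature=0, max_tokens=2056)
response = chat([HumanMessage(content="Write me an essay on Pune")])
print(response.content)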

@Anil-matcha

@sbc-max What is the maximum token limit that can be passed to ChatOpenAI? Is it 2056?


sbc-max commented Mar 14, 2023

@Anil-matcha

@sbc-max Thanks for the reference. Should I close this issue?

@huerlisi

I also experienced this. Adding max_tokens fixed the problem.

I guess adding a note about this to the getting-started tutorial would help newcomers.

@ahmed-bhs

When you set chat = ChatOpenAI(temperature=0, max_tokens=2056), you can get an exception of this kind:

This model's maximum context length is 4097 tokens. However, you requested 6005 tokens (3949 in the messages, 2056 in the completion)
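One workaround is to size max_tokens from the prompt length so the total stays under the context limit. A rough sketch, assuming tiktoken for counting and only approximating the chat-message framing overhead:

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
import tiktoken

CONTEXT_LIMIT = 4097  # context window reported in the error above

def safe_max_tokens(prompt: str, margin: int = 64) -> int:
    # Count the prompt tokens and leave a small margin for the
    # per-message framing the chat API adds.
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
    prompt_tokens = len(enc.encode(prompt))
    return max(1, CONTEXT_LIMIT - prompt_tokens - margin)

prompt = "Write me an essay on Pune"
chat = ChatOpenAI(temperature=0, max_tokens=safe_max_tokens(prompt))
response = chat([HumanMessage(content=prompt)])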
