
Limited Input/Output length seriously reduces Poe's usefulness compared to ChatGPT #76

Closed
ibehnam opened this issue Feb 6, 2024 · 6 comments


ibehnam commented Feb 6, 2024

Recently I've been trying to create an advanced academic bot on the platform that depends on the GPT-4, ChatGPT, and Mixtral 8×7b-Chat models. All was working well until I found out that there's an input length limit on Poe bots, which means I can't send long pieces of text to the underlying models through my bot. I thought the problem might be with my bot, so I tried sending the same text to GPT-4 on Poe manually. I got the same result: GPT-4 says this (screenshot):


This is odd because I can easily send the same piece of text on the ChatGPT website and it accepts it. Not only that, I can also generate much longer pieces of text on the ChatGPT website whereas on Poe, there seems to be a length limit. This restriction essentially means my bot can't work properly on Poe.

To make things worse, I also noticed that this limitation seriously degrades the quality of the "function calling" feature on Poe bots (https://developer.poe.com/server-bots/using-openai-function-calling). When a bot's input length is restricted, it also loses its ability to accurately follow the JSON schema of the function inputs (I even tried simplifying my JSON schema considerably, but to no avail). The end result is that my bot on Poe hallucinates much more than it would if I were using OpenAI's API directly, and makes up JSON fields that don't exist in the schema.
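For context, here is a minimal sketch (with hypothetical function and field names, not my actual bot's schema) of the kind of OpenAI-style function schema the linked Poe docs describe, plus a small check that illustrates the failure mode above: the model emitting argument fields that don't exist in the schema.

```python
import json

# Hypothetical example of an OpenAI-style function schema.
search_papers_tool = {
    "name": "search_papers",  # hypothetical function name
    "description": "Search academic papers by keyword.",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Search keywords"},
            "max_results": {"type": "integer", "description": "Cap on hits"},
        },
        "required": ["query"],
    },
}

def validate_call(arguments_json: str, schema: dict) -> bool:
    """Reject hallucinated fields: every argument must exist in the
    schema's properties, and all required fields must be present."""
    args = json.loads(arguments_json)
    allowed = set(schema["parameters"]["properties"])
    required = set(schema["parameters"].get("required", []))
    return set(args) <= allowed and required <= set(args)

# A call with a made-up field (the failure mode described above) fails:
print(validate_call('{"query": "LLMs", "page": 2}', search_papers_tool))  # False
print(validate_call('{"query": "LLMs"}', search_papers_tool))             # True
```

A guard like this only detects bad calls after the fact; it doesn't fix the underlying issue that truncated inputs make the model drift from the schema in the first place.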

I like Poe and have recommended it to my friends, which is why I hope you'll consider fixing these problems. My project is time-critical, and I'm afraid that without access to the actual GPT-4 context window I would have to use something else, like ChatGPT's GPT Store. I'd really prefer to use Poe, though, because of the flexibility provided by your API and the good documentation. If you have any questions, please feel free to ask!

anmolsingh95 (Contributor) commented

Hi Behnam. Just wanted to acknowledge that I saw your message and share that I filed an internal ticket to investigate this. In the meantime, would it be possible for you to provide a snippet of text that allows us to reproduce this? Thanks!

anmolsingh95 (Contributor) commented

Hi @ibehnam. Unfortunately, it will take longer for us to make the higher-context models available. In the meantime, would it be possible for you to try GPT-4-32k to see if that works? https://poe.com/GPT-4-32k


ibehnam commented Feb 8, 2024

> Hi Behnam. Just wanted to acknowledge that I saw your message and share that I filed an internal ticket to investigate this. In the meanwhile, will it be possible for you to provide a snippet of text that allows us to reproduce this? Thanks!

Sure! Here's one example: https://poe.com/s/s8bzSuskErMYCeufvkOB

In comparison, ChatGPT handles the exact same prompt with no problem:
https://chat.openai.com/share/31312be8-38d8-4f82-b890-6dec81359136


ibehnam commented Feb 8, 2024

> Hi @ibehnam. Unfortunately, it will take longer for us to make the higher context models available. In the meanwhile, is it possible for you to try out GPT-32K to see if that works? poe.com/GPT-4-32k

Thanks for offering this, but unfortunately the limit is not just about context window, it's about the length of text that the user is allowed to write in their messages (example above).

When Poe silently switched its official GPT-4 bot from gpt-4 to gpt-4-turbo, I was a bit disappointed, because gpt-4-turbo is a downgrade in terms of reasoning skill and message quality. But since it has a much longer context window (128k!), I decided to use it for my academic bot. I don't even know why Poe still serves the GPT-4-32k bot: it has always been rate-limited for me (50 messages), and although its quality is higher than gpt-4 and gpt-4-turbo, the limited number of allowed messages makes it unusable in practice. Not to mention that all of these Poe bots (based on gpt-4, gpt-4-turbo, and gpt-4-32k) suffer from the limited input and output length, which interferes with their function calling too.


anmolsingh95 (Contributor) commented

Hi @ibehnam. I raised this internally, but it seems it will take a bigger change to allow messages longer than the model's total context window, and it will likely not be addressed in a timeframe that is helpful to you. Apologies for the inconvenience this has caused on your end.


ibehnam commented Feb 17, 2024

@anmolsingh95 Thanks for the update. Just to be clear, I wasn't asking for messages longer than the model's total context window. As I said, Poe currently doesn't let us send inputs longer than some specific length n, even when n < model_context_window. GPT-4-turbo's context window is 128K tokens, which is almost 300 pages. But we can't send more than a couple of pages of content in our messages (either through the API or on the website).
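For reference, a back-of-the-envelope sketch of where the "almost 300 pages" figure comes from. The conversion constants are common rules of thumb for English text, not exact values:

```python
# Rough rule-of-thumb conversions (assumptions, not exact figures).
CONTEXT_TOKENS = 128_000
WORDS_PER_TOKEN = 0.75   # typical English average for GPT-style tokenizers
WORDS_PER_PAGE = 320     # a dense single-spaced page

words = CONTEXT_TOKENS * WORDS_PER_TOKEN  # ~96,000 words
pages = words / WORDS_PER_PAGE            # ~300 pages
print(f"{words:,.0f} words \u2248 {pages:.0f} pages")
```

So even a few pages of input uses only a small fraction of gpt-4-turbo's context window, which is why a hard cap on message length well below the window is so limiting.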

I was hoping that this would get resolved but I understand that Poe may want to save API costs by limiting input/output lengths.
