
Why is there a \n\n prepended to the response? #169

Closed
weiying-chen opened this issue Jan 1, 2024 · 2 comments
Labels: help wanted, question

Comments


weiying-chen commented Jan 1, 2024

Not an issue, really. I'm just curious: why is there a \n\n at the beginning of the response?

text: "\n\nQ: What did the universe say to the universe when it tripped over?
64bit added the question and help wanted labels on Jan 3, 2024
frankfralick (Contributor) commented:
Did you get that response using the legacy "Completions" API or the "Chat Completions" API? Before the move to the Chat Completions API, the whole interaction was essentially one long string. If you go to the playground, choose the legacy Completions API, and submit a prompt without any newline characters of your own, you will see that the model prepends two newline characters before the response text. If you reset it, type the same prompt, hit Enter twice so there is a blank line above the cursor, and then submit the request, the response text starts right where the cursor is.

It is predicting next tokens, so if its training data included many prompt/completion pairs where the completion starts with newlines, that is what you will see. If your prompt ends without newlines, the model predicts the double-newline token next; if your prompt ends with a single newline character, it prepends a single newline. You can try this yourself: 0, 1, or 2 trailing newlines after your prompt should yield the same spacing between the prompt and the response text under the older models.
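The spacing rule described above can be sketched as a small helper. This is purely illustrative of the observed behavior; `newlines_to_prepend` is a hypothetical function, not part of any OpenAI or async-openai API:

```rust
/// Count how many newline characters the prompt already ends with.
fn trailing_newlines(prompt: &str) -> usize {
    prompt.chars().rev().take_while(|&c| c == '\n').count()
}

/// Sketch of the observed behavior: the legacy completion models tend to
/// continue with enough newlines to leave a blank line between the prompt
/// and the answer, i.e. two newlines of total separation.
fn newlines_to_prepend(prompt: &str) -> usize {
    2usize.saturating_sub(trailing_newlines(prompt))
}

fn main() {
    assert_eq!(newlines_to_prepend("Tell me a joke"), 2);
    assert_eq!(newlines_to_prepend("Tell me a joke\n"), 1);
    assert_eq!(newlines_to_prepend("Tell me a joke\n\n"), 0);
}
```

Either way, the total separation between prompt and completion comes out the same, which matches what the playground shows.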

[screenshot: playground showing the prepended newlines]

Another way to see the difference clearly is to open the legacy playground and give it a very incomplete prompt: the model carries on right where you stopped. In the new APIs they've stopped that madness.

Old: [screenshot: legacy Completions API continuing the incomplete prompt]

New: [screenshot: Chat Completions API responding as a fresh message]

64bit (Owner) commented Mar 2, 2024

Thank you for the nice, detailed explanation with pictures. I think we can close this now.

64bit closed this as completed on Mar 2, 2024

3 participants