
Some weirdness going on when chatGPT is outputting code blocks and/or [[ #15

Closed
tophee opened this issue Mar 13, 2023 · 6 comments
Labels
bug Something isn't working

Comments

tophee commented Mar 13, 2023

I'm not sure what is going on, but I think it might be related to [[ being output in stream mode and/or within a code block. I guess they need to be escaped somehow. See here: https://cln.sh/zkG7Zmsj
Hope this is useful.

@tophee tophee changed the title Some weirdness going on when chatGPT is outputting code blocks Some weirdness going on when chatGPT is outputting code blocks and/or [[ Mar 13, 2023
bramses (Owner) commented Mar 13, 2023

Hi @tophee,

Yes, this is because the plugin is writing directly to Obsidian's editor when stream is on (see https://github.com/bramses/chatgpt-md/blob/master/main.ts#L164-L170), as if you were manually "typing" the characters really fast.

In fact, I had to artificially slow down the streaming of backtick characters because Obsidian's editor processes them slightly slower than other characters (I'm not sure why). So what you're seeing is just the name of the game, unfortunately.
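The per-character writing described above can be sketched like this (a minimal simulation, not the plugin's actual code; the delay values and the `appendChar` callback are illustrative assumptions):

```typescript
// Minimal simulation of streaming text into an editor one character at a
// time, with an extra delay on backticks. Delay values and the appendChar
// callback are illustrative assumptions, not the plugin's real code.
const sleep = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms));

async function streamToEditor(
  text: string,
  appendChar: (ch: string) => void, // stand-in for writing to Obsidian's editor
  charDelayMs = 2,
  backtickDelayMs = 20
): Promise<void> {
  for (let i = 0; i < text.length; i++) {
    const ch = text[i];
    appendChar(ch);
    // Throttle backticks harder, since the editor processes them more slowly.
    await sleep(ch === "`" ? backtickDelayMs : charDelayMs);
  }
}

// Usage: accumulate into a string in place of a real editor pane.
let buffer = "";
streamToEditor("```js\nconsole.log('hi');\n```", (ch) => (buffer += ch), 0, 0)
  .then(() => console.log(buffer));
```

Because each character lands in the editor individually, Obsidian reacts to `[[` and backticks exactly as if a human were typing them.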

If it bothers you, you can set stream to false and it should load in as one large block. Does that make sense?

tophee (Author) commented Mar 13, 2023

You mean it’s a feature, not a bug?

In any case, the first thing I tried was to turn stream off, but that didn't stop it from streaming. Then I saw that it was still set to true in Default Chat Frontmatter, so I changed it there too, but it still keeps streaming…

bramses (Owner) commented Mar 13, 2023

@tophee you have to set it to false; the default in the system is stream: true.

EDIT: just checked, there seems to be a bug somewhere in the logic, I'll mark it for next release

You mean it’s a feature, not a bug?

haha, yeah, I suppose

@bramses bramses added the bug Something isn't working label Mar 15, 2023
bramses (Owner) commented Mar 15, 2023

@bramses bramses closed this as completed Mar 15, 2023
tophee (Author) commented Mar 17, 2023

Thanks for fixing this! But may I ask why you consider the behaviour with square brackets as a feature?

As I think about it, maybe I misunderstood you and it's not so much a feature but rather the natural behaviour when streaming text into Obsidian. In that case: are you planning to make it possible to turn this off? I'm not sure what kind of effort is required for this in terms of code, but I'd imagine it's possible to temporarily deactivate features like auto-completion during streaming, no?

BTW: do you know whether what we see in streaming mode is the response as it is being produced, more or less in real time (which would mean that streaming provides faster responses), or is the stream fake in the sense that the entire answer is already produced before the streaming starts (which would mean that turning streaming off would not slow down responses)?

bramses (Owner) commented Mar 17, 2023

In that case: are you planning to make it possible to turn this off? I'm not sure what kind of effort is required for this in terms of code, but I'd imagine it's possible to temporarily deactivate features like auto-completion during streaming, no?

I meant feature as in it's built into Obsidian. ChatGPT MD has no control over how Obsidian writes to its own editor (https://github.com/obsidianmd/obsidian-api/blob/master/obsidian.d.ts#L902). It merely takes data from the OpenAI response and appends it to the editor. Anything lower level would probably break CodeMirror or cause some other unforeseen issue.

Edit: That being said, if you do find a solution, I'd be happy to accept it, please feel free to PR!
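The append-only behaviour described above can be sketched as follows; the `MinimalEditor` interface is a hypothetical stand-in for Obsidian's real Editor API, used here so the example is self-contained:

```typescript
// Sketch: the plugin only appends each streamed chunk at the cursor.
// MinimalEditor is a hypothetical stand-in for Obsidian's Editor API;
// nothing here escapes or intercepts characters like "[[" or backticks.
interface MinimalEditor {
  getValue(): string;
  insertAtCursor(text: string): void;
}

function appendChunk(editor: MinimalEditor, chunk: string): void {
  // The chunk goes straight through, so the editor's own auto-pairing and
  // link completion react exactly as if you had typed it by hand.
  editor.insertAtCursor(chunk);
}

// Usage with a trivial in-memory editor:
const fake: MinimalEditor = (() => {
  let doc = "";
  return {
    getValue: () => doc,
    insertAtCursor: (text: string) => { doc += text; },
  };
})();
appendChunk(fake, "See [[Some Note]]");
console.log(fake.getValue()); // "See [[Some Note]]"
```

In the real plugin, whatever Obsidian's editor then does with those characters is entirely Obsidian's behaviour, not the plugin's.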

BTW: do you know whether what we see in streaming mode is the response as it is being produced, more or less in real time (which would mean that streaming provides faster responses), or is the stream fake in the sense that the entire answer is already produced before the streaming starts (which would mean that turning streaming off would not slow down responses)?

As of now, the stream is fake, yes. I'm looking into an EventSource patch, but that may or may not work; I don't know yet.
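A "fake" stream in this sense can be illustrated with a small sketch: the full response already exists before any chunk is shown, and it is merely sliced up afterwards (the chunk size here is an arbitrary assumption):

```typescript
// "Fake" streaming: the complete response already exists before the first
// chunk is displayed, so the slicing only affects presentation, not how
// long the answer takes to arrive.
function fakeStream(fullResponse: string, chunkSize = 4): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < fullResponse.length; i += chunkSize) {
    chunks.push(fullResponse.slice(i, i + chunkSize));
  }
  return chunks;
}

// Usage: reassembling the chunks recovers the original answer exactly.
const answer = "The quick brown fox";
console.log(fakeStream(answer).join("") === answer); // true
```

A real stream, by contrast, would display tokens as they arrive over the wire (e.g. via server-sent events), which is what the EventSource idea is about.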
