
Handle AI streaming #2

Open

steven-tey opened this issue Sep 2, 2023 · 10 comments

Comments

@steven-tey
Would it be possible to hook this up to Novel's streaming endpoint (https://novel.sh/api/generate)? Would it handle streaming inside the Editor?

Alternatively, could we also enable Copilot-style autocompletes with this instead?

@BennyKok
Owner

BennyKok commented Sep 3, 2023

I bumped into some weird issues when trying to change the completionApi directly, so I ended up trying to change it directly in novel, but it seems like it's still hitting the default endpoint. I'm going to take a few more looks into this and will report back if the issue is elsewhere!

(screenshots: CleanShot 2023-09-04 at 00 09 10, CleanShot 2023-09-04 at 00 08 51@2x)

@BennyKok
Owner

BennyKok commented Sep 3, 2023

Now I see!

The "Continue writing" action uses a hardcoded /api/generate, while the ++ auto-completion uses the completionApi prop on NovelEditor.

We might have to fix that on novel's end.

However, calling https://novel.sh/api/generate directly will run into CORS issues as well.

I will take a look at whether it's possible to set up <> Copilot.

Another option would be setting up a local Vercel AI client in the VS Code extension, allowing users to set their own OpenAI API key.

@jt6677

jt6677 commented Sep 3, 2023

It should be two calls: first an OPTIONS preflight, then the POST. Using my own API, the default useCompletion does not seem to handle the preflight correctly in VS Code extension mode; it would only send the OPTIONS request but never the POST call. I had to build a hook with axios instead.
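For context on why two calls show up: a cross-origin POST that carries an Authorization header or a JSON Content-Type is "non-simple" under the CORS rules, so the browser always sends an OPTIONS preflight before the real request. The helper below is purely illustrative (not from Novel or the AI SDK) and mirrors the browser's decision:

```typescript
// Hypothetical helper mirroring the browser's CORS preflight rule:
// a request needs an OPTIONS preflight unless its method, headers,
// and Content-Type are all on the "simple" lists.
function needsPreflight(
  method: string,
  headers: Record<string, string>
): boolean {
  const simpleMethods = ["GET", "HEAD", "POST"];
  const simpleHeaders = [
    "accept",
    "accept-language",
    "content-language",
    "content-type",
  ];
  if (!simpleMethods.includes(method.toUpperCase())) return true;
  for (const name of Object.keys(headers)) {
    if (!simpleHeaders.includes(name.toLowerCase())) return true;
  }
  // Content-Type is only "simple" for form-like values, not application/json.
  const ct = headers["Content-Type"] ?? headers["content-type"];
  if (
    ct &&
    !/^(text\/plain|multipart\/form-data|application\/x-www-form-urlencoded)/i.test(ct)
  ) {
    return true;
  }
  return false;
}
```

So a POST with `Authorization: Bearer …` or `Content-Type: application/json` (both of which the completion call uses) will always preflight, and the server must answer that OPTIONS request or the POST never fires.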

@jt6677

jt6677 commented Sep 4, 2023

(video: Code_xc3owREKWY.mp4)

It took me a couple of hours to get this working. Basically, I used Novel as a component instead of as a package, and used an axios hook instead of `useCompletion`.

@jt6677

jt6677 commented Sep 4, 2023

```ts
import { useState } from "react";
import axios from "axios";

export const useAxiosCompletion = (api: string, authorization: string) => {
  const [isLoading, setIsLoading] = useState(false);
  const [completion, setCompletion] = useState<string>("");
  const headers = {
    Authorization: `Bearer ${authorization}`,
  };

  const complete = async (prompt: string) => {
    setIsLoading(true);
    try {
      // headers belong in the axios config (third argument), not the body;
      // validateStatus lets us inspect 429 instead of axios throwing on it
      const response = await axios.post(
        api,
        { prompt },
        { headers, validateStatus: () => true }
      );
      if (response.status === 429) {
        setIsLoading(false);
        return;
      }
      const respText = response.data.choices[0].message.content;
      setCompletion(respText);
    } catch (error) {
      console.log("error", error);
    }
    setIsLoading(false);
  };

  const stop = () => {
    // Implement any stop logic if necessary
  };

  return { complete, completion, isLoading, stop };
};
```
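One caveat with an axios hook like the one above: it only resolves once the full response arrives, so nothing streams into the editor token by token. True streaming would mean reading the response body incrementally and accumulating decoded chunks as they arrive. A minimal sketch of that accumulation step (names are illustrative; this is not the AI SDK's implementation):

```typescript
// Hypothetical accumulator for a streamed text completion: each network
// chunk is a Uint8Array, and { stream: true } lets TextDecoder handle
// multi-byte characters split across chunk boundaries.
function accumulateChunks(chunks: Uint8Array[]): string {
  const decoder = new TextDecoder();
  let text = "";
  for (const chunk of chunks) {
    text += decoder.decode(chunk, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered trailing bytes
}
```

In a real hook, you would call `setCompletion` with the growing string inside the read loop (e.g. over `response.body.getReader()` from fetch) so the editor renders partial output as it streams in.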

@BennyKok
Owner

BennyKok commented Sep 4, 2023

@jt6677 so you modified the code inside novel-vscode, or in a separate project for testing? Is the API directly calling https://novel.sh/api/generate?

@jt6677

jt6677 commented Sep 4, 2023

I basically copied the novel editor from node_modules into my src directory so I could modify how it handles API calls. I use my own endpoint, which is basically the same as https://novel.sh/api/generate (an edge function as well).

@jt6677

jt6677 commented Sep 4, 2023

A lot of work still needs to be done to parse the markdown correctly. Basically, you cannot take the OpenAI response directly: we need regexes to determine what format each line is (text, bullet point, checkbox, or headline), sanitize the markdown, and parse it into the respective Tiptap node types.

(video: Code_mSkCwoknx7.mp4)
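The regex-classification step described above could be sketched like this. The function name and the exact patterns are illustrative assumptions, not Novel's actual parser; a real version would map each kind to the corresponding Tiptap node:

```typescript
// Hypothetical classifier for a raw line of model output, run before
// converting it into a Tiptap node. Checkbox is tested before bullet
// because "- [ ] item" also matches the plain bullet pattern.
type LineKind = "heading" | "bullet" | "checkbox" | "text";

function classifyLine(line: string): LineKind {
  if (/^#{1,6}\s/.test(line)) return "heading";
  if (/^\s*[-*+]\s\[[ xX]\]\s/.test(line)) return "checkbox";
  if (/^\s*[-*+]\s/.test(line)) return "bullet";
  return "text";
}
```

Sanitization (stripping stray fences, normalizing list markers) would run before this, and the result feeds whichever Tiptap `setNode`/`insertContent` call matches the kind.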

@BennyKok
Copy link
Owner

BennyKok commented Sep 5, 2023

Yes, and this also comes up if we support tables or other markdown features; there would be a lot more edge cases to take care of.
We might have to prompt carefully with the context of where the user's cursor currently is.

e.g.

  • inside a table
  • in a list
  • in the main body
  • inside a code tag

@andredezzy

Completion is still not working; is there any solution for this?
