Handle AI streaming #2
Now I see! "Continue writing" is using a hardcoded endpoint, which might have to be fixed on Novel's end. I will take a look at whether it's possible to set up Copilot. Another option would be setting up a local Vercel AI client in the VSCode extension that allows users to set their own OpenAI API key.
It should be two calls: first an OPTIONS request, then the POST. Using my own API, the default useCompletion does not seem to handle this correctly in VSCode extension mode. It would only send the OPTIONS request but never the POST call. I had to build a hook with axios.
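The two-call behavior described above is the CORS preflight: the client sends an OPTIONS request first, and only follows up with the POST once it is answered. A minimal sketch of how an endpoint might answer the preflight (function and header names here are illustrative, not taken from the actual Novel endpoint):

```typescript
// Hypothetical sketch of a handler that answers the CORS preflight
// (OPTIONS) so the client can follow up with the real POST.
export function handleRequest(method: string): {
  status: number;
  headers: Record<string, string>;
} {
  const cors = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Authorization, Content-Type",
  };
  if (method === "OPTIONS") {
    // Preflight: reply immediately with the CORS headers and no body.
    return { status: 204, headers: cors };
  }
  // Actual POST: the completion response would go here.
  return { status: 200, headers: cors };
}
```

If the extension environment never issues the follow-up POST, the symptom is exactly what's described above: the server only ever sees the OPTIONS call.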
Code_xc3owREKWY.mp4

It took me a couple of hours to get this working. Basically, I used Novel as a component instead of as a package, and used an axios hook instead of `useCompletion`:
```tsx
import { useState } from "react";
import axios from "axios";

// `api` (the completion endpoint) and `authorization` (the bearer token)
// are assumed to come from the extension's configuration.
export const useAxiosCompletion = (api: string, authorization: string) => {
  const [isLoading, setIsLoading] = useState(false);
  const [completion, setCompletion] = useState<string>("");

  const headers = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Credentials": "true",
    Authorization: `Bearer ${authorization}`,
  };

  const complete = async (prompt: string) => {
    setIsLoading(true);
    try {
      // Send the prompt as the request body and the headers in the config;
      // don't throw on 429 so rate limiting can be handled below.
      const response = await axios.post(
        api,
        { prompt },
        { headers, validateStatus: (status) => status < 500 }
      );
      if (response.status === 429) {
        setIsLoading(false);
        return;
      }
      const respText = response.data.choices[0].message.content;
      setCompletion(respText);
    } catch (error) {
      console.log("error", error);
    }
    setIsLoading(false);
  };

  const stop = () => {
    // Implement any stop logic if necessary
  };

  return { complete, completion, isLoading, stop };
};
```
@jt6677 So did you modify the code inside novel-vscode, or in a separate project for testing? And is the API calling OpenAI directly?
I basically copied the Novel editor from node_modules into the src directory so I can modify how it handles API calls. I use my own endpoint, an edge function that is basically the same as https://novel.sh/api/generate.
A lot of work still needs to be done to parse the markdown correctly. Basically, we cannot take the OpenAI response directly. We need regex to determine what format each piece is (text, bullet point, checkbox, or headline), sanitize the markdown, and parse it into the respective Tiptap node types.

Code_mSkCwoknx7.mp4
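The regex classification step described above could be sketched roughly like this (a hypothetical helper, not code from the project; the category names and patterns are illustrative, and checkbox must be tested before bullet point since its syntax is a superset):

```typescript
// Hypothetical sketch: classify a markdown line before mapping it
// to the corresponding Tiptap node type.
type NodeKind = "headline" | "bulletpoint" | "checkbox" | "text";

export function classifyLine(line: string): NodeKind {
  const trimmed = line.trim();
  // 1-6 leading hashes followed by a space => heading.
  if (/^#{1,6}\s/.test(trimmed)) return "headline";
  // "- [ ]" / "- [x]" task syntax; check before plain bullets,
  // because every checkbox line also matches the bullet pattern.
  if (/^[-*+]\s\[[ xX]\]\s/.test(trimmed)) return "checkbox";
  // Plain "-", "*", or "+" bullet.
  if (/^[-*+]\s/.test(trimmed)) return "bulletpoint";
  return "text";
}
```

A real implementation would also need to sanitize the matched content (strip the markers, handle nesting) before constructing the Tiptap nodes.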
Yes, I came across this too. If we support tables or other markdown features, it will have to take care of a lot more edge cases.
Completion is not working yet; is there any solution for this?
Would it be possible to hook this up to Novel's streaming endpoint (https://novel.sh/api/generate)? Would it handle streaming inside the Editor?
Alternatively, could we enable Copilot autocompletes with this instead?