This repository has been archived by the owner on Sep 30, 2023. It is now read-only.

I am new and confused about the correct steps for the "Getting Started" section #17

Closed
RookyCoder opened this issue Apr 20, 2023 · 2 comments

Comments


RookyCoder commented Apr 20, 2023

I am thrilled to see this local AI coding assistant, but I am unsure about some of the installation steps. Your help is appreciated.


1.) Clone a copy of "main" of "turbopilot", shown as item I in the figure above.

2.) I went for Option A (download pre-quantized models) and pasted the file into the models folder in main. I chose the 6B ~4GB model (item II), which looks like the same model used in the example run command in step 4.

3.) Download binary of TurboPilot Server and extract it to the root project folder.

4.) ./codegen-serve -t 6 -m ./models/codegen-6B-multi-ggml-4bit-quant.bin

I want to use it in VS Code with the FauxPilot plugin; I am new to this too.
At the time of writing, it seems Venthe has accepted the PR, so there is no need to install "fauxpilot-1.1.5-ravenscroft.vsix" locally. Instead, I can download the plugin directly from the VS Code extension marketplace.

  1. After installing the "Fauxpilot" plugin: Ctrl+Shift+P - Preferences: Open User Settings (JSON)
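For reference, a sketch of what the User Settings (JSON) entries might look like. The key names here are my guess at what the Fauxpilot extension expects (verify them against the extension's README), and the port matches the 18080 that TurboPilot's API uses below:

```json
{
    // ASSUMPTION: key names are illustrative, not confirmed from the
    // extension docs. VS Code settings.json accepts JSONC comments.
    "fauxpilot.enabled": true,
    "fauxpilot.server": "http://localhost:18080/v1/engines"
}
```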

================================= @ravenscroftj
Q1: What is the suitable path for turbopilot-main? Somewhere in a VS Code folder, such as the installation path of Fauxpilot, or does it not matter?

Q2. Do I need an administrator cmd to execute step 4? Does it have to be run from the root project folder path?

The same goes for Q3 - where do I call it from?
C:\ A \ B \Downloads\TurbopilotALL\turbopilot-main\turbopilot-main

Q3. I am having difficulties with calling the API directly. I have no idea about these mysterious codes or commands, but I assume it can be called via a web browser URL and talks to the server in the root project folder?

Running the requests in admin cmd: http://localhost:18080/v1/engines/codegen/completions
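For anyone else stuck on this, here is a minimal Python sketch of calling the API directly. It assumes TurboPilot exposes an OpenAI-style completions endpoint at the URL above; the payload and response shapes are my assumption based on the OpenAI completions format that FauxPilot-compatible servers typically mimic:

```python
import json
import urllib.request

# Assumed endpoint, matching the URL in the question above.
URL = "http://localhost:18080/v1/engines/codegen/completions"


def build_completion_request(prompt: str, max_tokens: int = 32) -> dict:
    """Build an OpenAI-style completion payload (shape is an assumption)."""
    return {"prompt": prompt, "max_tokens": max_tokens}


def complete(prompt: str) -> str:
    """POST the prompt to the local server and return the first completion."""
    data = json.dumps(build_completion_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    # Assumed OpenAI-style response: {"choices": [{"text": "..."}], ...}
    return body["choices"][0]["text"]
```

Note that no admin cmd should be needed: any process that can reach localhost:18080 can call the API, regardless of which folder it runs from.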

Q4. Is there any conflict with other AI tools offering similar features, or with the IntelliSense functions built into VS Code?

Thanks

@ravenscroftj
Owner

Hi @RookyCoder

I think the confusion is probably because I currently don't ship a Windows version of turbocoder - I don't have a Windows machine and haven't had time to set up builds for Windows. I might have a look over the weekend, as it should be pretty straightforward given that llama.cpp does support Windows.

When I add a codegen-serve.exe, you'll just need to run it with the appropriate model name passed in.

You can call the API directly using any old HTTP client, but I wouldn't recommend it as a beginner - you might as well stick to using Fauxpilot or similar.

There should not be any conflict between turbocoder and other features - Fauxpilot for VSCode uses the official VSCode extension points, so it will complement other systems. I don't know if trying to use Fauxpilot and the official Copilot plugin at the same time would introduce some weirdness, so I probably wouldn't recommend that.

I'll reply here again once I get Windows builds working - although I don't have a Windows machine, so I'll have to rely on the community to test them for now.

@ravenscroftj
Owner

I'm going to close this issue for now. Follow #24 for status updates on Windows support.
