
Not able to launch UI with local model providers. #556

Closed
sangee2004 opened this issue Jun 26, 2024 · 2 comments
Labels: bug (Something isn't working), UI (Issues relating to "--ui" option)

Comments

@sangee2004 (Contributor)

gptscript version v0.8.5-rc4+3033b05a

Steps to reproduce the problem:

  1. Set streaming to false - GPTSCRIPT_INTERNAL_OPENAI_STREAMING=false
  2. Launch the UI using a local model provider - gptscript --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' --ui github.com/gptscript-ai/cli-demo (both steps are shown together below)
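
For reference, both steps combined in one shell session (a sketch; it assumes an OpenAI-compatible server is already serving the model at http://localhost:1234/v1, and the variable must be exported in the same shell that runs gptscript):

# local endpoint assumed from the repro steps above
export GPTSCRIPT_INTERNAL_OPENAI_STREAMING=false
gptscript --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' --ui github.com/gptscript-ai/cli-demo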

The UI fails to launch and the run gets stuck with "</" as the output:

gptscript --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' --ui github.com/gptscript-ai/cli-demo
16:41:49 WARNING: Changing the default model can have unknown behavior for existing tools. Use the model field per tool instead.
16:41:49 started  [main] [input=--file=github.com/gptscript-ai/cli-demo]
16:41:49 started  [context: github.com/gptscript-ai/context/os]
16:41:49 sent     [context: github.com/gptscript-ai/context/os]
16:41:49 ended    [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
16:41:50 sent     [main]
         content  [1] content | Waiting for model response...
         content  [2] content | The local operating systems is Darwin, release 23.3.0
         content  [2] content | 
16:42:00 started  [service(3)] [input={}]
16:42:00 launched [service][https://raw.githubusercontent.com/gptscript-ai/ui/c08908e223f67574a0a85cd8a98893ca413315ad/tool.gpt:service] port [10421] [/opt/homebrew/bin/gptscript sys.daemon /usr/bin/env npm run --prefix /Users/sangeethahariharan/Library/Caches/gptscript/repos/c08908e223f67574a0a85cd8a98893ca413315ad/tool.gpt/node21 dev]

> next-app-template@0.0.1 dev
> node server.mjs

> Socket server is ready at http://localhost:10421
 ○ Compiling / ...
 ✓ Compiled / in 2.2s (4300 modules)
 GET / 200 in 2441ms
16:42:04 ended    [service(3)] [output=\u003c!DOCTYPE html\u003e\u003chtml lang=\"en\"\u003e\u003chead\u003e\u003cmeta charSet=\"utf-8\"/\u003e\u003cmeta name=\"viewport\" content=\"width=dev...]
 POST / 200 in 19ms
16:42:04 continue [main]
16:42:04 started  [context: github.com/gptscript-ai/context/os]
16:42:04 sent     [context: github.com/gptscript-ai/context/os]
16:42:04 ended    [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
16:42:05 sent     [main]
         content  [1] content | Waiting for model response...
         content  [4] content | The local operating systems is Darwin, release 23.3.0
         content  [4] content | 
16:42:49 ended    [main] [output=\u003c/]
16:42:49 usage    [total=14145] [prompt=14126] [completion=19]

INPUT:

--file=github.com/gptscript-ai/cli-demo

OUTPUT:

</

When I try to open the UI at the specified port, I see the following errors in the console:

 GET / 200 in 27ms
 POST / 500 in 23ms
 ⨯ actions/scripts/fetch.tsx (50:42) @ $$ACTION_3
 ⨯ Error: no files found in scripts directory
    at $$ACTION_3 (./actions/scripts/fetch.tsx:68:42)
digest: "2782177463"
  48 |         const gptFiles = files.filter(file => file.endsWith('.gpt'));
  49 |         
> 50 |         if (gptFiles.length === 0) throw new Error('no files found in scripts directory');
     |                                          ^
  51 |
  52 |         const gptscript = new GPTScript();
  53 |         const scripts: Record<string, string> = {};
 ⨯ actions/scripts/fetch.tsx (50:42) @ $$ACTION_3
 ⨯ Error: no files found in scripts directory
    at $$ACTION_3 (./actions/scripts/fetch.tsx:68:42)
digest: "2782177463"
  48 |         const gptFiles = files.filter(file => file.endsWith('.gpt'));
  49 |         
> 50 |         if (gptFiles.length === 0) throw new Error('no files found in scripts directory');
     |                                          ^
  51 |
  52 |         const gptscript = new GPTScript();
  53 |         const scripts: Record<string, string> = {};
 POST / 500 in 8ms

UI shows up as follows:
[Screenshot attached: "Screenshot 2024-06-25 at 5 41 18 PM"]

@sangee2004 (Contributor, Author)

With the latest build from main (v0.0.0-dev-53f7fbde-dirty), I see that the tool call to the service does not succeed. This is probably because GPTSCRIPT_INTERNAL_OPENAI_STREAMING=false is not taking effect.

% export GPTSCRIPT_INTERNAL_OPENAI_STREAMING=false
 % gptscript --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' --ui github.com/gptscript-ai/cli-demo
12:20:41 WARNING: Changing the default model can have unknown behavior for existing tools. Use the model field per tool instead.
12:20:42 started  [main] [input=--file=github.com/gptscript-ai/cli-demo]
12:20:42 started  [context: github.com/gptscript-ai/context/os]
12:20:42 sent     [context: github.com/gptscript-ai/context/os]
12:20:42 ended    [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
12:20:42 sent     [main]
         content  [2] content | The local operating systems is Darwin, release 23.3.0
         content  [2] content | 
12:20:42 ended    [main] [output=starttoolcall{\"name\": \"service\", \"arguments\": \"{}\"}endtoolcall]

INPUT:

--file=github.com/gptscript-ai/cli-demo

OUTPUT:

starttoolcall{"name": "service", "arguments": "{}"}endtoolcall

@sangee2004 (Contributor, Author)

The issue reported above is caused by exceeding the model's context window during the tool execution required to launch the UI.

I am able to use the UI successfully when using a local model with a 32k context length. I tried it with the Qwen2-7B-Instruct-function-calling-alpha-v1.gguf model, which has a 32k context length.

With gptscript version v0.0.0-dev-53f7fbde-dirty, I am able to launch the UI with the local provider successfully and have a chat conversation using the local model.
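
For reference, the working invocation with the 32k-context model (a sketch; it assumes the same local OpenAI-compatible endpoint from the original repro, http://localhost:1234/v1, now serving the Qwen2 model):

# 32k-context model loaded at the assumed local endpoint
export GPTSCRIPT_INTERNAL_OPENAI_STREAMING=false
gptscript --default-model 'Qwen2-7B-Instruct-function-calling-alpha-v1.gguf from http://localhost:1234/v1' --ui github.com/gptscript-ai/cli-demo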
