
Does not work on WSL Ubuntu 20.04 #95

Open
repo-gardener opened this issue Sep 30, 2023 · 13 comments
Labels: help wanted (Extra attention is needed), windows_wsl

Comments

@repo-gardener
It installs fine with pip install og_up, don't get me wrong. The chat window is just mangled and no output shows up.

@imotai (Contributor) commented Sep 30, 2023

Which model did you choose?

@repo-gardener (Author)
At first I left it blank because I thought (gpt3-turbo-16k) was the default. Then I just loaded it up and entered gpt-3.5 as the model, and I still have the empty chat window. I also pasted my API key in with Ctrl-V; hopefully it actually went in there.

@imotai (Contributor) commented Sep 30, 2023

Can you give me the output of the command og_up?

imotai added the help wanted and windows_wsl labels on Sep 30, 2023
@repo-gardener (Author)

[screenshot of og_up output]

@imotai (Contributor) commented Sep 30, 2023

Great. Next, you can check the logs:

# 1. get the container id
docker ps | grep octogen
# 2. enter the docker container (replace container_id with the id from step 1)
docker exec -it container_id bash
# 3. check the agent log
hap logs 3

There may be a traceback in there; if so, please share it with me.

@imotai (Contributor) commented Sep 30, 2023

[screenshot]
The following are valid model names:

gpt-3.5-turbo
gpt-3.5-turbo-0301
gpt-3.5-turbo-0613
gpt-3.5-turbo-16k
gpt-3.5-turbo-16k-0613
gpt-3.5-turbo-instruct
gpt-3.5-turbo-instruct-0914
I think I should add validation to check the model name.
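
A minimal sketch of what such a check might look like (my assumption, not octogen's actual code; validate_model_name and the hard-coded allow list are hypothetical):

# Hypothetical sketch: reject unknown model names early with a helpful error.
VALID_MODEL_NAMES = {
    "gpt-3.5-turbo",
    "gpt-3.5-turbo-0301",
    "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k",
    "gpt-3.5-turbo-16k-0613",
    "gpt-3.5-turbo-instruct",
    "gpt-3.5-turbo-instruct-0914",
}

def validate_model_name(name: str) -> str:
    """Return the name if it is valid, otherwise raise with the list of valid names."""
    name = name.strip()
    if name not in VALID_MODEL_NAMES:
        raise ValueError(
            f"unknown model name {name!r}; valid names are: "
            + ", ".join(sorted(VALID_MODEL_NAMES))
        )
    return name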

@repo-gardener (Author) commented Sep 30, 2023 via email

@repo-gardener (Author)

[screenshot]

@jackfood commented Oct 1, 2023

Also having the same issue.

[screenshot]

[screenshot]

@imotai (Contributor) commented Oct 1, 2023

I will try codellama on my Windows machine. Thanks for the feedback.

@imotai (Contributor) commented Oct 2, 2023

@jackfood @NarcoticNarcoleptic I could not reproduce the issue on my Windows machine; maybe there are some environmental differences. I have updated the README for local development without Docker, so you can give that a try.

@repo-gardener (Author)
I appreciate all the responses to this issue, y'all. I tried installing in a conda env yesterday but still had the same issues; I will update you if I get it fixed. I also have not yet properly checked the Docker logs with the commands mentioned above, so I will provide those this week.

@imotai (Contributor) commented Oct 3, 2023

[screenshot]

Maybe your Docker version does not meet the requirement: octogen needs Docker to support JSON output.

docker ps --help | grep json
                        'json':             Print in JSON format
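
A minimal pre-flight check along these lines could look like the sketch below; this is only my assumption of how such a check might work (the docker_supports_json_format helper is hypothetical), mirroring the docker ps --help | grep json test above:

# Hypothetical sketch: check whether the installed docker CLI advertises the
# 'json' output format for "docker ps", as octogen appears to require.
import subprocess

def docker_supports_json_format() -> bool:
    try:
        result = subprocess.run(
            ["docker", "ps", "--help"],
            capture_output=True, text=True, check=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return False  # docker is missing or not runnable
    return "'json'" in result.stdout

if __name__ == "__main__":
    print("docker json format supported:", docker_supports_json_format())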
