Issues: OpenInterpreter/open-interpreter
#1023 "cmd" disabled or not supported [Bug, Good first issue] (opened Feb 17, 2024 by Notnaton)
#1032 Documentation for Hosted Provider > Ollama [Enhancement] (opened Feb 22, 2024 by yousecjoe)
#1049 UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte [Bug] (opened Mar 1, 2024 by Politas380)
#1051 Not setting interpreter to custom model [Bug] (opened Mar 2, 2024 by PontiacGTX)
#1071 Low tolerance to network connection issues [Bug] (opened Mar 12, 2024 by shuther)
#1111 [IPKernelApp] CRITICAL error when using Google Colab [Bug] (opened Mar 22, 2024 by hacker-hackman)
#1133 Show total response latency in verbose mode [Enhancement] (opened Mar 26, 2024 by gitpushoriginmaster)
#1134 pip --upgrade didn't work and turned a working open-interpreter package into a non-working one (possible fixes inside) [Bug] (opened Mar 26, 2024 by AncalagonX)
#1138 Split docs into their own repo? [Enhancement] (opened Mar 27, 2024 by wolfspyre)
#1139 [docs] Link default profile exemplars from the profiles docs page [Enhancement] (opened Mar 27, 2024 by wolfspyre)
#1155 Tried to install LM Studio on Windows; it crashed during installation and now open-interpreter crashes too (opened Mar 30, 2024 by onigetoc)
#1162 Feature request: git-based guardrails around filesystem changes (opened Apr 2, 2024 by matwerber1)
#1169 Make usage without an API key possible and run LLM command output directly in the local console (opened Apr 3, 2024 by hgftrdw45ud67is8o89)
#1189 FastAPI server: output of StreamingResponse for the run-code part (opened Apr 9, 2024 by 13293824182)