Then attempt to run:
python server.py --api --listen --listen-host 192.168.3.18 --listen-port 7860 --n-gpu-layers 24 --threads 8 --numa --tensorcores --trust-remote-code
Screenshot
No response
Logs
(textgen) [root@pve0 text-generation-webui]# python server.py --api --listen --listen-port 7860 --n-gpu-layers 24 --threads 8 --numa --tensorcores --trust-remote-code
18:06:59-333880 INFO Starting Text generation web UI
18:06:59-343693 WARNING trust_remote_code is enabled. This is dangerous.
18:06:59-344709 WARNING
You are potentially exposing the web UI to the entire internet without any access password.
You can create one with the "--gradio-auth" flag like this:
--gradio-auth username:password
Make sure to replace username:password with your own.
18:06:59-346542 INFO Loading settings from "settings.yaml"
18:06:59-359341 INFO Loading the extension "openai"
18:06:59-740049 INFO OpenAI-compatible API URL:
http://0.0.0.0:5000
18:06:59-743648 INFO Loading the extension "gallery"
Running on local URL: http://0.0.0.0:7860
Exception in ASGI application
Traceback (most recent call last):
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
return await self.app(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/gradio/route_utils.py", line 680, in __call__
await self.app(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
await self.middleware_stack(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
await route.handle(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
await self.app(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
response = await func(request)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
raw_response = await run_endpoint_function(
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 851, in run
result = context.run(func, *args)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/gradio/routes.py", line 377, in main
return templates.TemplateResponse(
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/templating.py", line 229, in TemplateResponse
template = self.get_template(name)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/templating.py", line 143, in get_template
return self.env.get_template(name)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/environment.py", line 1010, in get_template
return self._load_template(name, globals)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/environment.py", line 969, in _load_template
template = self.loader.load(self, name, self.make_globals(globals))
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/loaders.py", line 125, in load
source, filename, uptodate = self.get_source(environment, name)
File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/loaders.py", line 206, in get_source
with open(filename, encoding=self.encoding) as f:
File "/data/text-generation-webui/modules/block_requests.py", line 46, in my_open
file_contents = file_contents.replace(b'\t\t<script\n\t\t\tsrc="https://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.9/iframeResizer.contentWindow.min.js"\n\t\t\tasync\n\t\t></script>', b'')
TypeError: replace() argument 1 must be str, not bytes
System Info
Oracle Linux 8.3
Note: this error does not happen on Rocky Linux 9 (where I don't compile llama-cpp-python manually like I do on 8.3).
It seems that the my_open function in modules/block_requests.py (around line 40) opens the index.html template in text mode, but then tries to replace the Cloudflare script tag using bytes arguments. Changing the replacement arguments to str, and then using the correct io.TextIOWrapper, fixed a follow-up error as well.
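The change described above can be sketched roughly as follows. This is a minimal illustration under assumptions, not the actual patch: SNIPPET is copied from the traceback, and the isinstance check stands in for whatever mode handling the real fix uses.

```python
import builtins
import io

original_open = builtins.open

# Script tag copied from the traceback above.
SNIPPET = ('\t\t<script\n\t\t\tsrc="https://cdnjs.cloudflare.com/ajax/libs/'
           'iframe-resizer/4.3.9/iframeResizer.contentWindow.min.js"'
           '\n\t\t\tasync\n\t\t></script>')

def my_open(*args, **kwargs):
    # Sketch of a type-safe version of the patched open():
    # match the replace() arguments to whatever type read() returned.
    filename = str(args[0])
    if filename.endswith('index.html'):
        with original_open(*args, **kwargs) as f:
            file_contents = f.read()
        # read() returns str in text mode and bytes in binary mode,
        # so the replacement arguments must be of the same type.
        if isinstance(file_contents, bytes):
            return io.BytesIO(file_contents.replace(SNIPPET.encode(), b''))
        return io.StringIO(file_contents.replace(SNIPPET, ''))
    return original_open(*args, **kwargs)
```

With this, both a text-mode open (yielding str) and a binary-mode open (yielding bytes) go through a replacement of the matching type, which is exactly what the TypeError above was complaining about.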
This issue has been closed due to inactivity for 2 months. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
Describe the bug
Attempting to start the web UI on Oracle Linux 8.3 with a manual build of llama-cpp-python, since the manylinux wheels are not compatible. I receive an error related to:
File "/data/text-generation-webui/modules/block_requests.py", line 46, in my_open
file_contents = file_contents.replace(b'\t\t<script\n\t\t\tsrc="https://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.9/iframeResizer.contentWindow.min.js"\n\t\t\tasync\n\t\t></script>', b'')
TypeError: replace() argument 1 must be str, not bytes
However, if I set share=True, the error goes away, but then I have an unwanted public URL.
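The quoted TypeError is general Python behavior rather than anything specific to the web UI: str.replace() refuses bytes arguments outright. A standalone reproduction:

```python
# str.replace() requires str arguments; passing bytes raises TypeError,
# which is exactly the failure shown in the traceback.
text = "hello world"
try:
    text.replace(b"world", b"")
    raised = False
except TypeError as exc:
    raised = True
    print(exc)  # replace() argument 1 must be str, not bytes

assert raised
# The same call with matching types succeeds:
assert text.replace("world", "") == "hello "
```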
Is there an existing issue for this?
Reproduction
TypeError: replace() argument 1 must be str, not bytes
# commented out the manylinux packages in requirements due to glibc < 2.29, intending to compile llama-cpp-python manually
# then installed the commented-out packages manually
then attempt to run
python server.py --api --listen --listen-host 192.168.3.18 --listen-port 7860 --n-gpu-layers 24 --threads 8 --numa --tensorcores --trust-remote-code
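Since the incompatibility hinges on the host glibc version, it can help to record it alongside a report like this. A quick check, assuming a glibc-based Linux (the glibc >= 2.29 requirement is taken from the report above; EL8-family distributions such as Oracle Linux 8.3 ship an older glibc):

```shell
# Print the host glibc version; the manylinux wheels mentioned above
# are stated to require glibc >= 2.29.
ldd --version | head -n 1
```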