
TypeError: replace() argument 1 must be str, not bytes #5839

Closed
1 task done
thistleknot opened this issue Apr 10, 2024 · 2 comments

Labels
bug Something isn't working stale

Comments

@thistleknot
Describe the bug

Attempting to start the web UI on Oracle Linux 8.3 with a manual install of llama-cpp-python, since the manylinux wheels are not compatible.

I receive the following error:

File "/data/text-generation-webui/modules/block_requests.py", line 46, in my_open
file_contents = file_contents.replace(b'\t\t<script\n\t\t\tsrc="https://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.9/iframeResizer.contentWindow.min.js"\n\t\t\tasync\n\t\t></script>', b'')
TypeError: replace() argument 1 must be str, not bytes

However, if I set share=True, the error goes away, but then I have an unwanted public URL.
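
For context, here is a minimal, self-contained sketch (illustration only, not the webui code) of why the call fails: a file read in text mode yields str, so passing bytes arguments to replace() raises exactly this TypeError.

file_contents = "<html>\t\t<script></script></html>"   # what open(..., "r") returns: str

try:
    file_contents.replace(b"<script></script>", b"")    # bytes pattern on a str
except TypeError as e:
    print(e)  # replace() argument 1 must be str, not bytes

# One defensive fix: match the pattern type to the data that was actually read.
if isinstance(file_contents, bytes):
    cleaned = file_contents.replace(b"<script></script>", b"")
else:
    cleaned = file_contents.replace("<script></script>", "")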

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

TypeError: replace() argument 1 must be str, not bytes

#commented out the manylinux packages from requirements due to glibc < 2.29; I intend to compile llama-cpp-python manually

#ensure these lines are commented out
(textgen) [root@pve0 text-generation-webui]# cat requirements_noavx2.txt | grep "#"
# API
# llama-cpp-python (CPU only, no AVX2)
#https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/cpu/llama_cpp_python-0.2.56+cpuavx-cp311-cp311-manylinux_2_31_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"
#https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/cpu/llama_cpp_python-0.2.56+cpuavx-cp310-cp310-manylinux_2_31_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"
# llama-cpp-python (CUDA, no tensor cores)
#https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.56+cu121avx-cp311-cp311-manylinux_2_31_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"
#https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.56+cu121avx-cp310-cp310-manylinux_2_31_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"
# llama-cpp-python (CUDA, tensor cores)
#https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda_tensorcores-0.2.56+cu121avx-cp311-cp311-manylinux_2_31_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.11"
#https://github.com/oobabooga/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda_tensorcores-0.2.56+cu121avx-cp310-cp310-manylinux_2_31_x86_64.whl; platform_system == "Linux" and platform_machine == "x86_64" and python_version == "3.10"
# CUDA wheels

#then install the commented-out packages manually

rm -rf llama-cpp-python
git clone --recurse-submodules https://github.com/abetlen/llama-cpp-python.git
cd llama-cpp-python/
CMAKE_ARGS="-DLLAMA_CUBLAS=on -DLLAMA_AVX=on -DLLAMA_AVX2=off" pip install -e .[all] --force
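
As a quick sanity check (my own suggestion, not part of the original steps), you can confirm the manually built package imports in the textgen environment before starting the server:

# optional check that the hand-compiled llama-cpp-python is importable
import llama_cpp
print(llama_cpp.__version__)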

then attempt to run
python server.py --api --listen --listen-host 192.168.3.18 --listen-port 7860 --n-gpu-layers 24 --threads 8 --numa --tensorcores --trust-remote-code

Screenshot

No response

Logs

(textgen) [root@pve0 text-generation-webui]# python server.py --api --listen --listen-port 7860 --n-gpu-layers 24 --threads 8 --numa --tensorcores --trust-remote-code
18:06:59-333880 INFO     Starting Text generation web UI
18:06:59-343693 WARNING  trust_remote_code is enabled. This is dangerous.
18:06:59-344709 WARNING
                         You are potentially exposing the web UI to the entire internet without any access password.
                         You can create one with the "--gradio-auth" flag like this:

                         --gradio-auth username:password

                         Make sure to replace username:password with your own.
18:06:59-346542 INFO     Loading settings from "settings.yaml"
18:06:59-359341 INFO     Loading the extension "openai"
18:06:59-740049 INFO     OpenAI-compatible API URL:

                         http://0.0.0.0:5000

18:06:59-743648 INFO     Loading the extension "gallery"

Running on local URL:  http://0.0.0.0:7860

Exception in ASGI application
Traceback (most recent call last):
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/gradio/route_utils.py", line 680, in __call__
    await self.app(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/gradio/routes.py", line 377, in main
    return templates.TemplateResponse(
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/templating.py", line 229, in TemplateResponse
    template = self.get_template(name)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/starlette/templating.py", line 143, in get_template
    return self.env.get_template(name)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/environment.py", line 1010, in get_template
    return self._load_template(name, globals)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/environment.py", line 969, in _load_template
    template = self.loader.load(self, name, self.make_globals(globals))
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/loaders.py", line 125, in load
    source, filename, uptodate = self.get_source(environment, name)
  File "/root/miniconda3/envs/textgen/lib/python3.10/site-packages/jinja2/loaders.py", line 206, in get_source
    with open(filename, encoding=self.encoding) as f:
  File "/data/text-generation-webui/modules/block_requests.py", line 46, in my_open
    file_contents = file_contents.replace(b'\t\t<script\n\t\t\tsrc="https://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.9/iframeResizer.contentWindow.min.js"\n\t\t\tasync\n\t\t></script>', b'')
TypeError: replace() argument 1 must be str, not bytes

System Info

Oracle Linux 8.3

Note: this error does not happen on Rocky Linux 9 (but there I don't compile llama-cpp-python manually like I do on 8.3).
@Googolplexed0

It seems like the my_open function in modules/block_requests.py (around line 40) opens the index.html template in text mode, but then tries to replace the Cloudflare script tag with bytes arguments. I changed the replacements to str and then used io.TextIOWrapper to fix a follow-up error:

def my_open(*args, **kwargs):
    filename = str(args[0])
    if filename.endswith('index.html'):
        # Jinja2 opens the template in text mode, so f.read() returns str
        with original_open(*args, **kwargs) as f:
            file_contents = f.read()

        # use str patterns (not bytes) to match the str contents
        file_contents = file_contents.replace('\t\t<script\n\t\t\tsrc="https://cdnjs.cloudflare.com/ajax/libs/iframe-resizer/4.3.9/iframeResizer.contentWindow.min.js"\n\t\t\tasync\n\t\t></script>', '')
        file_contents = file_contents.replace('cdnjs.cloudflare.com', '127.0.0.1')

        # return a text-mode file-like object, matching what the caller expects
        return io.TextIOWrapper(io.BytesIO(file_contents.encode(kwargs["encoding"])))
    else:
        return original_open(*args, **kwargs)
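
If you want the patch to be robust regardless of whether the file is opened in text or binary mode, a more defensive variant (a sketch that assumes original_open is the saved builtins.open, not the upstream fix) could branch on the type of what was read:

import io

def my_open_defensive(*args, **kwargs):
    # hypothetical variant of the patch above
    filename = str(args[0])
    if filename.endswith('index.html'):
        with original_open(*args, **kwargs) as f:
            file_contents = f.read()

        if isinstance(file_contents, bytes):
            # binary mode: bytes patterns, return a bytes file-like object
            file_contents = file_contents.replace(b'cdnjs.cloudflare.com', b'127.0.0.1')
            return io.BytesIO(file_contents)
        else:
            # text mode: str patterns, return a text file-like object
            file_contents = file_contents.replace('cdnjs.cloudflare.com', '127.0.0.1')
            return io.StringIO(file_contents)
    return original_open(*args, **kwargs)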


This issue has been closed due to inactivity for 2 months. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
