
no idea what has happened but container really not happy #35

@chboishabba

Description

Hi mate :)

Not really sure what's going on with the container... It may be worth confirming that the compiled images I have uploaded to Docker Hub match yours.
In any case, both the originally working container and the subsequently recompiled ones appear to be hanging at some low kernel level (python3 shows status D in top). I'm not sure if I've corrupted my GPU's BIOS or something, but it's not even outputting errors, segfaults, etc.
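
For what it's worth, this is roughly how I've been confirming the stuck state from the host (just a sketch; 12345 is a placeholder for whatever PID python3 shows in top, and /proc/<pid>/stack needs root):

# sketch only: confirm the D state and dump the kernel stack of the stuck python3
from pathlib import Path

pid = 12345  # placeholder: the python3 PID from top
state = Path(f"/proc/{pid}/stat").read_text().split()[2]
print("state:", state)  # 'D' = uninterruptible sleep, i.e. blocked inside the kernel/driver

try:
    # kernel call chain for the stuck task (root only); amdgpu/kfd frames here
    # would point at the GPU driver rather than the container itself
    print(Path(f"/proc/{pid}/stack").read_text())
except PermissionError:
    print("need root to read /proc/<pid>/stack")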

Steps to reproduce:

(Mon Jun 16 11:26:43) c@archb ~$ docker run -it --name whisperx_live_edit_test \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --ipc=host \
  --cap-add SYS_PTRACE \
  --security-opt seccomp=unconfined \
  -e ROC_ENABLE_PRE_VEGA=1 \
  -e HSA_OVERRIDE_GFX_VERSION=8.0.3 \
  -e JOBLIB_MULTIPROCESSING=0 \
  -p 7860:7860 \
  rocm64_gfx803_whisperx:latest bash

Use "faster-whisper" implementation

Device "cuda" is detected

* Running on local URL:  http://0.0.0.0:7860

* Running on public URL: https://dff5ea454672538144.gradio.live


This share link expires in 1 week. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces) 
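
As a side note, this is the sanity check I'd run inside the container to rule out the GPU simply not being visible (a minimal sketch, assuming the torch build in the container's /Whisper-WebUI/venv; ROCm torch exposes the card through the cuda API, which matches the Device "cuda" is detected line above):

# minimal sketch: confirm the gfx803 card is actually usable from inside the
# container (assumes the ROCm build of torch from the container's venv)
import torch

print(torch.__version__)
print(torch.cuda.is_available())          # should be True with /dev/kfd and /dev/dri passed through
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # should name the gfx803 card
    x = torch.ones(1024, device="cuda")   # tiny allocation to exercise the runtime
    print(x.sum().item())                 # expect 1024.0 if the runtime is healthy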

Set model to small.en, language to English, compute type to int8, upload test audio, and press Generate.

Page indicates 'Initialising Model'

(Mon Jun 16 12:11:06) c@archb ~$ docker logs whisperx_live_edit_test
Use "faster-whisper" implementation
Device "cuda" is detected
* Running on local URL:  http://0.0.0.0:7860
* Running on public URL: https://dff5ea454672538144.gradio.live

This share link expires in 1 week. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces)
Use "faster-whisper" implementation
Device "cuda" is detected
* Running on local URL:  http://0.0.0.0:7860
* Running on public URL: https://7cac9cb28f978a3bdc.gradio.live

This share link expires in 1 week. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces)
Traceback (most recent call last):
  File "/Whisper-WebUI/modules/whisper/base_transcription_pipeline.py", line 272, in transcribe_file
    for file in files:
                ^^^^^
TypeError: 'NoneType' object is not iterable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/gradio/queueing.py", line 625, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/gradio/route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/gradio/blocks.py", line 2181, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/gradio/blocks.py", line 1692, in call_function
    prediction = await anyio.to_thread.run_sync(  # type: ignore
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 2470, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 967, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/venv/lib/python3.12/site-packages/gradio/utils.py", line 889, in wrapper
    response = f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^
  File "/Whisper-WebUI/modules/whisper/base_transcription_pipeline.py", line 318, in transcribe_file
    raise RuntimeError(f"Error transcribing file: {e}") from e
RuntimeError: Error transcribing file: 'NoneType' object is not iterable
config.json: 100%|█████████████████████████████████████████████████████████████████████████████| 2.32k/2.32k [00:00<00:00, 2.97MB/s]
vocabulary.txt: 100%|█████████████████████████████████████████████████████████████████████████████| 422k/422k [00:00<00:00, 453kB/s]
tokenizer.json: 100%|███████████████████████████████████████████████████████████████████████████| 2.13M/2.13M [00:05<00:00, 408kB/s]
model.bin: 100%|████████████████████████████████████████████████████████████████████████████████| 75.5M/75.5M [01:33<00:00, 812kB/s]
(Mon Jun 16 12:11:08) c@archb ~$ docker stop whisperx_live_edit_test
Error response from daemon: cannot stop container: whisperx_live_edit_test: tried to kill container, but did not receive an exit event
(Mon Jun 16 12:11:40) c@archb ~$ docker stop whisperx_live_edit_test
Error response from daemon: cannot stop container: whisperx_live_edit_test: tried to kill container, but did not receive an exit event
(Mon Jun 16 12:12:14) c@archb ~$ 
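
Side issue, but the TypeError in the logs above is transcribe_file iterating over files when the upload apparently never reached the backend. Something like this guard would at least turn it into a readable error instead of a traceback (hypothetical sketch, not the actual Whisper-WebUI code; the signature and names are assumptions):

# hypothetical guard, not the actual Whisper-WebUI code: fail early with a
# readable message when Gradio passes no files into the handler
import gradio as gr

def transcribe_file(files, *params):
    if not files:  # None (or empty) when the upload never reached the backend
        raise gr.Error("No audio files received - upload a file before pressing Generate.")
    for file in files:
        ...  # existing per-file transcription path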
