
NVENC capable FFMPEG build for alpine #36

Closed
thebigbadoof opened this issue Apr 15, 2022 · 27 comments


@thebigbadoof

thebigbadoof commented Apr 15, 2022

Hello,
I am trying to build sonarr-sma with an NVIDIA-enabled ffmpeg but am running into errors. I am attempting to build it with Docker Compose like this:

    container_name: sonarr-sma
    build:
      context: https://github.com/mdhiggins/sonarr-sma.git#build
      args:
        - ffmpeg_tag=4.4-nvidia2004
        - sonarr_tag=develop
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: [gpu, utility]
    volumes:
      - $DOCKERDIR/sonarr-sma/config:/config
      - $DOCKERDIR/sonarr-sma/sma:/usr/local/sma/config
      - '${DLDIR}/completed:/data/completed'
      - '${TVDIR}:/TV_Shows'
    networks:
      - t2_proxy
    restart: always
    environment:
      - PUID=$PUID
      - PGID=$PGID
      - SMA_HWACCEL=true
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=all

    labels:
      - "traefik.enable=true"
      - "diun.enable=true"
      - "flame.type=application" # "app" works too
      - "flame.name=sonarr-sma"
      # HTTP Routers
      - "traefik.http.routers.sonarr-sma-rtr.entrypoints=https"
      - "traefik.http.routers.sonarr-sma-rtr.rule=Host(`sma.$DOMAINNAME`)"
      ## Middlewares
      - "traefik.http.routers.sonarr-sma-rtr.middlewares=chain-authelia@file"      
      ## HTTP Services
      - "traefik.http.routers.sonarr-sma-rtr.service=sonarr-sma-svc"
      - "traefik.http.services.sonarr-sma-svc.loadbalancer.server.port=8989"

being my last attempt. I initially built it with ffmpeg_tag=4.4-nvidia and sonarr_tag=latest but ran into the issue

libnppig.so.10 cannot open shared object file: No such file or directory

when attempting to run ffmpeg or ffprobe. I read in mdhiggins/sickbeard_mp4_automator/issues/1470 that I needed to match the versions, so I switched the tag to ffmpeg_tag=4.4-nvidia1804, but no joy. I attempted a few other tags only to run into the same issue. I could run ffprobe and ffmpeg manually if I went into the container and did export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda/lib64, but I did not know how to apply that at build time. I attempted messing with the Dockerfile but was unsuccessful.
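If it helps, the export can in principle be baked in at build time with an ENV instruction; this is just a sketch using the path from my manual workaround, and whether the build Dockerfile picks it up depends on how it's structured:

```dockerfile
# Persist the CUDA library path so ffmpeg/ffprobe can resolve libnpp* at
# run time without a manual export (path taken from the workaround above)
ENV LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/usr/local/cuda/lib64"
```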

I then attempted to build with sonarr_tag=develop and ffmpeg_tag=4.4-nvidia2004, as a couple of other users did in issues, but it looks like the develop branch has moved the image to Alpine, which does not seem to work with jrottenberg's NVIDIA images. I attempted a few other tags but kept getting the errors below.

FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/bin/ffprobe'
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/bin/ffmpeg'

Attempting to run manual.py,

abc@c650bbf26c8b:/$ "/usr/local/sma/venv/bin/python3" "/usr/local/sma/manual.py" -i "/TV_Shows/xxx/Season 1/xxx.mkv"
Manual processor started.
Python 64-bit 3.9.7 (default, Nov 24 2021, 21:15:59) 
[GCC 10.3.1 20211027].
Guessit version: 3.4.3.
/usr/local/sma/venv/bin/python3
Loading config file /usr/local/sma/config/autoProcess.ini. isValidSource unexpectedly threw an exception, returning None.
Traceback (most recent call last):
  File "/usr/local/sma/resources/mediaprocessor.py", line 314, in isValidSource
    info = self.converter.probe(inputfile)
  File "/usr/local/sma/converter/__init__.py", line 344, in probe
    return self.ffmpeg.probe(fname, posters_as_video)
  File "/usr/local/sma/converter/ffmpeg.py", line 610, in probe
    stdout_data = self._get_stdout([
  File "/usr/local/sma/converter/ffmpeg.py", line 560, in _get_stdout
    p = self._spawn(cmds)
  File "/usr/local/sma/converter/ffmpeg.py", line 553, in _spawn
    return Popen(cmds, shell=False, stdin=PIPE, stdout=PIPE, stderr=PIPE,
  File "/usr/lib/python3.9/subprocess.py", line 951, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.9/subprocess.py", line 1821, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/bin/ffprobe'

errors out like above. The only attempt I tried to do to fix this was this stackflow and adding libc6-compat as an extra package to install but to no avail.
I have verified that the files /usr/local/bin/ffprobe and /usr/local/bin/ffmpeg do exist in those locations as well.

I don't know what else to attempt so any insight would be greatly appreciated.

@mdhiggins
Owner

I have yet to find a reliable ffmpeg container for NVENC on Alpine Linux, but that's essentially what you need. linuxserver has been migrating all their containers over to Alpine, and I think that's why you're seeing issues.

I did expand the capability of the build Dockerfile to allow you to use custom ffmpeg containers beyond the jrottenberg one, or to pull from the repository. You can also specify a custom URL if someone has a premade binary for you.

Finally, you can just create the binaries yourself and use volumes to mount them into your container like any other Docker container.

https://github.com/funkypenguin/embyserver-nvenc

Looks like this container compiles ffmpeg with NVENC under Alpine; you could potentially pull the binaries from that, though I haven't tested this.

Open to suggestions if you find a container, but that's what you need: an Alpine base with FFMPEG built with NVENC.

@mdhiggins
Owner

mdhiggins commented Apr 15, 2022

Just to clarify some of the tags

  • build
    • ffmpeg_source by default is jrottenberg/ffmpeg
    • ffmpeg_tag by default is 4.4-ubuntu
  • non-build
    • SMA_FFMPEG_URL to pull from a different source besides johnvansickle.com
    • SMA_STRIP_COMPONENTS if your archive needs a different level of component stripping to match directories
    • SMA_USE_REPO to just install the ffmpeg package from the OS repository
    • or just use volumes to mount your binaries at /usr/local/bin/ffmpeg and /usr/local/bin/ffprobe
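For that last option, a compose sketch (the host paths here are placeholders for wherever your prebuilt binaries live):

```yaml
# Mount prebuilt ffmpeg/ffprobe into the locations SMA expects
# (/opt/ffmpeg-bin/* are placeholder host paths)
volumes:
  - /opt/ffmpeg-bin/ffmpeg:/usr/local/bin/ffmpeg:ro
  - /opt/ffmpeg-bin/ffprobe:/usr/local/bin/ffprobe:ro
```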

@thebigbadoof
Author

thebigbadoof commented Apr 15, 2022

Thank you for the insight!
I suppose sonarr_tag=latest will one day go to Alpine as well, so any fixes on that side may be moot. I've briefly taken a look at the container you listed, but I'm not optimistic, as it has not been updated since 2018 and the readme says not to use it. I did a brief look around, and I can see why you haven't found a reliable container for NVENC with Alpine: the typical NVIDIA drivers are compiled against glibc (see this recent issue asking for Alpine support in nvidia-docker).
I will see if I can make one of the methods you suggested work and report back.

@mdhiggins
Owner

Definitely keep me posted, as I assume this will become a bigger issue for more people when the latest container goes Alpine in the near future.

I unfortunately don't have an NVIDIA server to test against, so any input is appreciated; it would be nice to get a solution.

@thebigbadoof
Author

Will do. And I'm sorry, I'm not very versed in Docker building, but where does the #build context get its Dockerfile from? The one in the repo doesn't seem to be what it is built from. I wanted to try and see if building proper glibc into the image would work, and I can now see why my previous attempts to modify the Dockerfile failed, since I wasn't actually changing the correct one.

@mdhiggins
Owner

https://github.com/mdhiggins/sonarr-sma/blob/build/Dockerfile

@thebigbadoof
Author

I was successful in getting glibc into the image and running the programs, but it seems like I can't convert anything using NVIDIA, as it errors out. I will have to revisit it again when I have more time, sorry, but this is what you would add to the Dockerfile:

# Source: https://github.com/anapsix/docker-alpine-java

ENV GLIBC_REPO=https://github.com/sgerrand/alpine-pkg-glibc
ENV GLIBC_VERSION=2.30-r0

RUN set -ex && \
    apk --update add libstdc++ curl ca-certificates && \
    for pkg in glibc-${GLIBC_VERSION} glibc-bin-${GLIBC_VERSION}; \
        do curl -sSL ${GLIBC_REPO}/releases/download/${GLIBC_VERSION}/${pkg}.apk -o /tmp/${pkg}.apk; done && \
    apk add --allow-untrusted /tmp/*.apk && \
    rm -v /tmp/*.apk && \
    /usr/glibc-compat/sbin/ldconfig /lib /usr/glibc-compat/lib

and it was from the Stack Overflow post from earlier. I'm not sure if I was testing correctly, but I was using the command ffmpeg -i "xxx.mkv" -c:v h264_nvenc -preset hq test.mkv. As this is my first time building and messing with SMA and ffmpeg, I'm not even sure if NVIDIA can even be used. nvidia-smi does produce output, which means the container has GPU access, and ffmpeg -hide_banner -hwaccels outputs cuda.
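One quick sanity check (assuming the built ffmpeg is on PATH) is to ask the binary whether the NVENC encoder was compiled in; note this only proves compile-time support, not that libcuda.so.1 actually loads at run time:

```shell
# Report whether h264_nvenc is among the compiled-in encoders
if ffmpeg -hide_banner -encoders 2>/dev/null | grep -q h264_nvenc; then
  echo "nvenc encoder: present"
else
  echo "nvenc encoder: missing"
fi
```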

@mdhiggins
Owner

What kind of ffmpeg error did you get?

@thebigbadoof
Author

Hello, I would like to clarify my previous statement about "running the programs": I didn't actually run them per se, but just checked whether they could be called from the prompt, ffprobe and ffmpeg. The errors that come up when calling them are as follows.

calling ffprobe

ffprobe: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffprobe: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffprobe: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libx265.so.192)
ffprobe: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libx265.so.192)
ffprobe version 4.4.1 Copyright (c) 2007-2021 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --disable-debug --disable-doc --disable-ffplay --enable-avresample --enable-cuda --enable-cuvid --enable-fontconfig --enable-gpl --enable-libaom --enable-libaribb24 --enable-libass --enable-libbl
uray --enable-libfdk_aac --enable-libfreetype --enable-libkvazaar --enable-libmp3lame --enable-libnpp --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libsrt --
enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxvid --enable-libzmq --enable-nonfree --enable-nvenc --enable
-openssl --enable-postproc --enable-shared --enable-small --enable-version3 --extra-cflags='-I/opt/ffmpeg/include -I/opt/ffmpeg/include/ffnvcodec -I/usr/local/cuda/include/' --extra-ldflags='-L/opt/ffmpeg/lib -L/
usr/local/cuda/lib64 -L/usr/local/cuda/lib32/' --extra-libs=-ldl --extra-libs=-lpthread --prefix=/opt/ffmpeg
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Simple multimedia streams analyzer
usage: ffprobe [OPTIONS] [INPUT_FILE]

You have to specify one input file.
Use -h to get full help or, even better, run 'man ffprobe'.

calling ffmpeg

ffmpeg: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffmpeg: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffmpeg: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libx265.so.192)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libx265.so.192)
ffmpeg version 4.4.1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --disable-debug --disable-doc --disable-ffplay --enable-avresample --enable-cuda --enable-cuvid --enable-fontconfig --enable-gpl --enable-libaom --enable-libaribb24 --enable-libass --enable-libbl
uray --enable-libfdk_aac --enable-libfreetype --enable-libkvazaar --enable-libmp3lame --enable-libnpp --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libsrt --
enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxvid --enable-libzmq --enable-nonfree --enable-nvenc --enable
-openssl --enable-postproc --enable-shared --enable-small --enable-version3 --extra-cflags='-I/opt/ffmpeg/include -I/opt/ffmpeg/include/ffnvcodec -I/usr/local/cuda/include/' --extra-ldflags='-L/opt/ffmpeg/lib -L/
usr/local/cuda/lib64 -L/usr/local/cuda/lib32/' --extra-libs=-ldl --extra-libs=-lpthread --prefix=/opt/ffmpeg
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

Use -h to get full help or, even better, run 'man ffmpeg'

and calling the above command to test

ffmpeg: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffmpeg: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffmpeg: /usr/lib/libgomp.so.1: no version information available (required by /usr/local/lib/libvidstab.so.1.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libzmq.so.5)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libsrt.so.1)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libx265.so.192)
ffmpeg: /usr/lib/libstdc++.so.6: no version information available (required by /usr/local/lib/libx265.so.192)
ffmpeg version 4.4.1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --disable-debug --disable-doc --disable-ffplay --enable-avresample --enable-cuda --enable-cuvid --enable-fontconfig --enable-gpl --enable-libaom --enable-libaribb24 --enable-libass --enable-libbl
uray --enable-libfdk_aac --enable-libfreetype --enable-libkvazaar --enable-libmp3lame --enable-libnpp --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libsrt --
enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxvid --enable-libzmq --enable-nonfree --enable-nvenc --enable
-openssl --enable-postproc --enable-shared --enable-small --enable-version3 --extra-cflags='-I/opt/ffmpeg/include -I/opt/ffmpeg/include/ffnvcodec -I/usr/local/cuda/include/' --extra-ldflags='-L/opt/ffmpeg/lib -L/
usr/local/cuda/lib64 -L/usr/local/cuda/lib32/' --extra-libs=-ldl --extra-libs=-lpthread --prefix=/opt/ffmpeg
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Input #0, matroska,webm, from 'xxx.mkv':
  Metadata:
    encoder         : no_variable_data
    creation_time   : 1970-01-01T00:00:00.000000Z
  Duration: 00:23:40.20, start: 0.000000, bitrate: 4133 kb/s
  Stream #0:0: Video: h264, yuv420p(progressive), 1280x720 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc (default)
    Metadata:
      BPS-eng         : 3970145
      DURATION-eng    : 00:23:40.129000000
      NUMBER_OF_FRAMES-eng: 34049
      NUMBER_OF_BYTES-eng: 704764839
      _STATISTICS_WRITING_APP-eng: no_variable_data
      _STATISTICS_WRITING_DATE_UTC-eng: 1970-01-01 00:00:00
      _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
  Stream #0:1(jpn): Audio: aac, 44100 Hz, stereo, fltp (default)
    Metadata:
      BPS-eng         : 128000
      DURATION-eng    : 00:23:40.202000000
      NUMBER_OF_FRAMES-eng: 61163
      NUMBER_OF_BYTES-eng: 22723234
      _STATISTICS_WRITING_APP-eng: no_variable_data
      _STATISTICS_WRITING_DATE_UTC-eng: 1970-01-01 00:00:00
      _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
  Stream #0:2(eng): Subtitle: ass (default)
    Metadata:
      title           : English subs
      BPS-eng         : 145
      DURATION-eng    : 00:23:40.100000000
      NUMBER_OF_FRAMES-eng: 411
      NUMBER_OF_BYTES-eng: 25758
      _STATISTICS_WRITING_APP-eng: no_variable_data
      _STATISTICS_WRITING_DATE_UTC-eng: 1970-01-01 00:00:00
      _STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
  Stream #0:3: Attachment: ttf
    Metadata:
      filename        : Roboto-Medium.ttf
      mimetype        : application/x-truetype-font
  Stream #0:4: Attachment: ttf
    Metadata:
      filename        : Roboto-MediumItalic.ttf
      mimetype        : application/x-truetype-font
  Stream #0:5: Attachment: ttf
    Metadata:
      filename        : arial.ttf
      mimetype        : application/x-truetype-font
  Stream #0:6: Attachment: ttf
    Metadata:
      filename        : arialbd.ttf
      mimetype        : application/x-truetype-font
  Stream #0:7: Attachment: ttf
    Metadata:
      filename        : comic.ttf
      mimetype        : application/x-truetype-font
  Stream #0:8: Attachment: ttf
    Metadata:
      filename        : comicbd.ttf
      mimetype        : application/x-truetype-font
  Stream #0:9: Attachment: ttf
    Metadata:
      filename        : times.ttf
      mimetype        : application/x-truetype-font
  Stream #0:10: Attachment: ttf
    Metadata:
      filename        : timesbd.ttf
      mimetype        : application/x-truetype-font
  Stream #0:11: Attachment: ttf
    Metadata:
      filename        : trebuc.ttf
      mimetype        : application/x-truetype-font
  Stream #0:12: Attachment: ttf
    Metadata:
      filename        : trebucbd.ttf
      mimetype        : application/x-truetype-font
  Stream #0:13: Attachment: ttf
    Metadata:
      filename        : verdana.ttf
      mimetype        : application/x-truetype-font
  Stream #0:14: Attachment: ttf
    Metadata:
      filename        : verdanab.ttf
      mimetype        : application/x-truetype-font
File 'test.mkv' already exists. Overwrite? [y/N] y
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (h264_nvenc))
  Stream #0:1 -> #0:1 (aac (native) -> vorbis (libvorbis))
  Stream #0:2 -> #0:2 (ass (ssa) -> ass (ssa))
Press [q] to stop, [?] for help
[h264_nvenc @ 0x559011b59f80] Cannot load libcuda.so.1
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
[libvorbis @ 0x55901215ed40] 39 frames left in the queue on closing
Conversion failed!

I'm not very versed in ffmpeg, but I hope that helps!

TL;DR for your question: Cannot load libcuda.so.1

@mdhiggins
Owner

Hm, you got pretty far there. What command were you running that gave you that error message?

Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

@mdhiggins mdhiggins changed the title from "ffmpeg with nvidia builtin" to "NVENC capable FFMPEG build for alpine" on Apr 17, 2022
@thebigbadoof
Author

I was running ffmpeg -i "xxx.mkv" -c:v h264_nvenc -preset hq test.mkv

@mdhiggins
Owner

Not sure if the preset could be creating issues, but a cleaner test command that eliminates the other encoders and removes any extra parameters is probably worthwhile:

ffmpeg -i "xxx.mkv" -map 0:0 -c:v h264_nvenc test.mkv

Additionally, you can try with a bitrate:

ffmpeg -i "xxx.mkv" -map 0:0 -c:v -b:v 2000 h264_nvenc test.mkv

If that doesn't work, then it probably still isn't compiled correctly.
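One correction on that second command: the codec name has to directly follow -c:v, and -b:v takes a unit-suffixed rate, so a working sketch (xxx.mkv being a placeholder input) would read:

```shell
# Corrected: codec name directly after -c:v, bitrate with a unit suffix
# (2000k = 2000 kbit/s); xxx.mkv is a placeholder input file
ffmpeg -i "xxx.mkv" -map 0:0 -c:v h264_nvenc -b:v 2000k test.mkv
```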

@thebigbadoof
Author

ffmpeg -i "xxx.mkv" -map 0:0 -c:v h264_nvenc test.mkv gave me the same error as the other command which was Cannot load libcuda.so.1

ffmpeg -i "xxx.mkv" -map 0:0 -c:v -b:v 2000 h264_nvenc test.mkv gave me Unable to find a suitable output format for '2000'.

So even with glibc built in, it doesn't work. I may need to look at both the Dockerfile jrottenberg uses for Alpine and the NVIDIA Dockerfile, to see whether building ffmpeg with Alpine as the base, on top of glibc, inside that container would make it work. When I have more time I'll see if that works.

@thebigbadoof
Author

So I did not have any more luck trying to get an Alpine-capable ffmpeg build, so I attempted to just downgrade Sonarr to the release right before the switch to Alpine, which I found to be develop-3.0.6.1460-ls250. However, I still could not get NVENC to work.

[h264_nvenc @ 0x55bb1928e3c0] Cannot load libnvidia-encode.so.1
[h264_nvenc @ 0x55bb1928e3c0] The minimum required Nvidia driver for nvenc is (unknown) or newer

It seems like something about this container does not like ffmpeg with NVENC. The container does seem to have GPU access, with commands like nvidia-smi outputting correctly. I built ffmpeg with NVENC and confirmed it worked locally (not in the container), and also installed Jellyfin's ffmpeg, but still could not get it to work. The drivers shouldn't be a problem, because both Plex and Jellyfin use them and transcode with NVENC fine. So I'm at a loss. FYI, provided I don't use h264_nvenc, ffmpeg seems to work fine in all three versions I tried (self-built, jrottenberg, and Jellyfin).
I'm not sure if Sonarr did something to their develop image that prevents NVENC from running correctly with ffmpeg, or maybe the way the image is built is missing something?
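For what it's worth, a quick check from inside the container that the NVIDIA runtime actually mounted the user-space driver libraries nvenc loads (the two library names are the ones from the error messages above):

```shell
# Report whether the driver libraries nvenc dlopens are visible to the
# dynamic linker inside the container
for lib in libcuda.so.1 libnvidia-encode.so.1; do
  if ldconfig -p 2>/dev/null | grep -q "$lib"; then
    echo "$lib: found"
  else
    echo "$lib: missing"
  fi
done
```

If either library reports missing despite nvidia-smi working, that points at the runtime configuration (e.g. NVIDIA_DRIVER_CAPABILITIES) rather than the ffmpeg build.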

@mdhiggins
Owner

This should be working; I know other users had working NVEnc setups before the Alpine switch. Here's a post for reference:

mdhiggins/sickbeard_mp4_automator#1444

Not sure if that would be helpful to compare.

What jrottenberg tag were you using for ffmpeg?

@mdhiggins
Owner

@KaHooli Just curious if you have encountered any solutions to getting nvenc ffmpeg builds on alpine since I know you were using a similar setup before they migrated from ubuntu

@thebigbadoof
Author

thebigbadoof commented Apr 27, 2022

Yeah, that's why I'm stumped; I've seen previous posts where it worked just fine for people, but they were from last year, so I'm not sure if Sonarr did something on their end. I've tried a lot of the tags; the ones I've built with, I believe, range from 4.3-nvidia2004 all the way to 5.0. I also tried to make it work on latest, which was still on 1804, and those NVIDIA builds were a year old. I also built my own and tested that it worked, so ffmpeg is probably not the issue; I can only assume the container itself is doing something weird with the NVIDIA drivers, since nvidia-docker2 passes your drivers to containers.
Using nvidia-smi command

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.103.01   Driver Version: 470.103.01   CUDA Version: 11.4     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
|  0%   36C    P8     5W / 120W |    109MiB /  3016MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
+-----------------------------------------------------------------------------+

Using cat /proc/driver/nvidia/version

NVRM version: NVIDIA UNIX x86_64 Kernel Module  470.103.01  Thu Jan  6 12:10:04 UTC 2022
GCC version:  gcc version 9.4.0 (Ubuntu 9.4.0-1ubuntu1~20.04.1) 

and they're identical to both my jellyfin and plex containers.

sonarr container os

NAME="Ubuntu"
VERSION="20.04.3 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.3 LTS"
VERSION_ID="20.04"

jellyfin

NAME="Ubuntu"
VERSION="20.04.4 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.4 LTS"
VERSION_ID="20.04"

plex

NAME="Ubuntu"
VERSION="20.04.4 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.4 LTS"
VERSION_ID="20.04"

host os

Distributor ID:Ubuntu
Description:Ubuntu 20.04.4 LTS
Release:20.04
Codename:focal

As I'm writing this, I see that the sonarr container is on 20.04.3, whereas plex, jellyfin, and my host are on 20.04.4. I'll try and see if I can get a 20.04.4 image for sonarr.
Sorry, I'm kind of just thinking out loud here to get ideas, since I'd really like this to work lol

@bradfrosty

bradfrosty commented May 5, 2022

I have also been running into this. I was finally able to get it working, with no changes needed to the underlying Dockerfile itself.

For me there were two general issues:
A) The alpine build of linuxserver/sonarr is not compatible with the NVIDIA runtime. This seems to be the case across several other alpine containers as well.
B) The compose service needs to explicitly enable NVIDIA features.

To address problem A, I went with the simple route. Instead of using sonarr_tag: develop as shown in the example in the README, I omitted this ARG. Since the latest tag is still using ubuntu, this eliminates issues with alpine.

To address problem B, there were three steps:

  1. Add runtime: nvidia to the sonarr compose service
  2. Add the environment variables NVIDIA_VISIBLE_DEVICES: all and NVIDIA_DRIVER_CAPABILITIES: all. The latter is important because it mounts libnvidia-encode.so.1 into the container, necessary for using nvenc.
  3. Add the environment variable LD_LIBRARY_PATH: /usr/local/cuda/lib64. ffmpeg was unable to find cuda shared libraries without this.
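As a quick sanity check for step 2, here is a sketch (assuming the container image ships `ldconfig` and `grep`) that confirms the NVIDIA runtime actually injected the encode libraries:

```shell
# Inside the running container: when NVIDIA_DRIVER_CAPABILITIES includes
# "video" (or "all"), the runtime should have mounted libnvidia-encode
# and libnvcuvid. If grep finds nothing, recheck the runtime/capabilities.
ldconfig -p | grep -E 'libnvidia-encode|libnvcuvid' \
  || echo "encode libraries not visible -- check runtime and NVIDIA_DRIVER_CAPABILITIES"
```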

One issue that might be specific to my version was that /usr/local/cuda/lib64 did not exist; the libraries were instead under /usr/local/cuda-11.4/lib64. I pointed my LD_LIBRARY_PATH there rather than renaming the dir. Not sure why it isn't under the expected path.

Additionally, you might need to make sure nvidia-docker is properly set up on the host machine, with drivers recent enough for the version of ffmpeg you have installed. I am personally using ffmpeg 5.0.1, CUDA 11.6, and NVIDIA driver 510.60.02. Even though my docker runtime is using CUDA 11.4, it still worked fine.

After all of this, I was able to verify it by running manual.py and confirming the GPU ffmpeg usage with nvidia-smi.
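For anyone else verifying this, a minimal sketch (assuming the service container is named `sonarr-sma` and the host has `nvidia-smi` on its PATH):

```shell
# List the NVENC encoders exposed by the container's ffmpeg build;
# hevc_nvenc / h264_nvenc should appear if the build and libraries are good.
docker exec sonarr-sma ffmpeg -hide_banner -encoders | grep nvenc

# While a conversion runs, the ffmpeg process should show up here
# under the Processes table with nonzero GPU memory usage.
watch -n 1 nvidia-smi
```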

Here's my configs for reference:

# docker-compose.yaml
sonarr:
  restart: unless-stopped
  build:
    context: https://github.com/mdhiggins/sonarr-sma.git#build
    args:
      ffmpeg_tag: 5.0.1-nvidia2004
  environment:
    PUID: ${PUID}
    PGID: ${PGID}
    TZ: ${TZ}
    SMA_HWACCEL: true
    NVIDIA_VISIBLE_DEVICES: all
    NVIDIA_DRIVER_CAPABILITIES: all
    LD_LIBRARY_PATH: /usr/local/cuda-11.4/lib64
  runtime: nvidia
  security_opt:
    - no-new-privileges:true
  volumes:
    - ./config/sonarr:/config
    - ./config/sma:/usr/local/sma/config
    - /mnt/storage/downloads:/downloads
    - /mnt/storage/gmedia:/media
; autoProcess.ini
[Converter]
ffmpeg = ffmpeg
ffprobe = ffprobe
threads = 0
hwaccels = nvenc, cuvid, vaapi, dxva2, qsv, d3d11va
hwaccel-decoders = hevc_cuvid, h264_cuvid, mjpeg_cuvid, mpeg1_cuvid, mpeg2_cuvid, mpeg4_cuvid, vc1_cuvid, vp8_cuvid, vp9_cuvid, hevc_qsv, h264_qsv, hevc_vaapi, h264_vaapi
hwdevices = cuda:0
hwaccel-output-format = cuda:cuda
output-directory =
output-format = mp4
output-extension = mp4
temp-extension =
minimum-size = 0
ignored-extensions = nfo, ds_store
copy-to =
move-to =
delete-original = True
sort-streams = True
process-same-extensions = False
bypass-if-copying-all = False
force-convert = False
post-process = False
wait-post-process = False
detailed-progress = False
opts-separator = ,
preopts =
postopts =
regex-directory-replace = [^\w\-_\. ]
temp-output = False

[Permissions]
chmod = 0644
uid = -1
gid = -1

[Metadata]
relocate-moov = True
full-path-guess = True
tag = True
tag-language = eng
download-artwork = poster
sanitize-disposition =
strip-metadata = False
keep-titles = False

[Video]
codec = hevc_nvenc, hevc
max-bitrate = 0
bitrate-ratio =
crf = -1
crf-profiles =
preset =
codec-parameters =
dynamic-parameters = False
max-width = 0
profile =
max-level = 0.0
pix-fmt =
filter =
force-filter = False
prioritize-source-pix-fmt = True

[HDR]
codec =
pix-fmt =
space = bt2020nc
transfer = smpte2084
primaries = bt2020
preset =
codec-parameters =
filter =
force-filter = False
profile =

[Audio]
codec = ac3
languages =
default-language =
first-stream-of-language = False
allow-language-relax = True
channel-bitrate = 128
variable-bitrate = 0
max-bitrate = 0
max-channels = 2
prefer-more-channels = True
filter =
profile =
force-filter = False
sample-rates =
sample-format =
copy-original = False
aac-adtstoasc = False
ignore-truehd = mp4, m4v
ignored-dispositions =
unique-dispositions = False
stream-codec-combinations =

[Universal Audio]
codec = aac
channel-bitrate = 128
variable-bitrate = 0
first-stream-only = False
filter =
profile =
force-filter = False

[Audio.ChannelFilters]
6-2 = pan=stereo|FL=0.5*FC+0.707*FL+0.707*BL+0.5*LFE|FR=0.5*FC+0.707*FR+0.707*BR+0.5*LFE

[Subtitle]
codec = mov_text
codec-image-based =
languages =
default-language =
first-stream-of-language = False
encoding =
burn-subtitles = False
burn-dispositions =
embed-subs = True
embed-image-subs = False
embed-only-internal-subs = False
filename-dispositions = forced
ignore-embedded-subs = False
ignored-dispositions =
unique-dispositions = False
attachment-codec =

[Subtitle.CleanIt]
enabled = False
config-path =
tags =

[Subtitle.Subliminal]
download-subs = False
download-hearing-impaired-subs = False
providers =

[Subtitle.Subliminal.Auth]
opensubtitles =
tvsubtitles =

[Sonarr]
host = 127.0.0.1
port = 8989
apikey = 
ssl = False
webroot =
force-rename = False
rescan = True
block-reprocess = False

[Radarr]
host = 127.0.0.1
port = 7878
apikey = 
ssl = False
webroot =
force-rename = False
rescan = True
block-reprocess = False

[Sickbeard]
host = localhost
port = 8081
ssl = False
apikey =
webroot =
username =
password =

[Sickrage]
host = localhost
port = 8081
ssl = False
apikey =
webroot =
username =
password =

[SABNZBD]
convert = True
sickbeard-category = sickbeard
sickrage-category = sickrage
sonarr-category = sonarr
radarr-category = radarr
bypass-category = bypass
output-directory =
path-mapping =

[Deluge]
sickbeard-label = sickbeard
sickrage-label = sickrage
sonarr-label = sonarr
radarr-label = radarr
bypass-label = bypass
convert = True
host = localhost
port = 58846
username =
password =
output-directory =
remove = False
path-mapping =

[qBittorrent]
sickbeard-label = sickbeard
sickrage-label = sickrage
sonarr-label = sonarr
radarr-label = radarr
bypass-label = bypass
convert = True
action-before =
action-after =
host = localhost
port = 8080
ssl = False
username =
password =
output-directory =
path-mapping =

[uTorrent]
sickbeard-label = sickbeard
sickrage-label = sickrage
sonarr-label = sonarr
radarr-label = radarr
bypass-label = bypass
convert = True
webui = False
action-before =
action-after =
host = localhost
ssl = False
port = 8080
username =
password =
output-directory =
path-mapping =

[Plex]
host = localhost
port = 32400
refresh = False
token =

[Audio.Sorting]
sorting = language, channels.d, map, d.comment
default-sorting = channels.d, map, d.comment
codecs =

[Subtitle.Sorting]
sorting = language, d.comment, d.default.d, d.forced.d
codecs =

@mdhiggins
Owner

This will unfortunately be a short-lived solution, as the latest tag will probably be updated to alpine in the near future, but I appreciate you sharing what you've learned.

@bradfrosty

It seems like until nvidia-docker supports alpine, those using hwaccel with an NVIDIA GPU will need to stick to the last ubuntu release tag.

@mdhiggins
Owner

Looks like as of the recent develop branch commit they are rebasing sonarr back to ubuntu

@wdckwrth

wdckwrth commented Nov 7, 2022

It looks like sonarr has been rebased to focal. Does your post imply that nvenc should be "do-able" now? I have been poking away at it but can't get a functioning ffmpeg installed. I am using the nvidia runtime and can run nvidia-smi in the container. Thanks.

@mdhiggins
Owner

What build config are you using?

@wdckwrth

wdckwrth commented Nov 10, 2022

Been using this one:

  container_name: sonarr-sma
    build:
      context: https://github.com/mdhiggins/sonarr-sma.git#build
      args:
        ffmpeg_tag: 5.1-nvidia2004
        sonarr_tag: latest

ffmpeg -h
ffmpeg: error while loading shared libraries: libnppig.so.11: cannot open shared object file: No such file or directory

Adding in extra_packages:

    container_name: sonarr-sma
    build:
      context: https://github.com/mdhiggins/sonarr-sma.git#build
      args:
        ffmpeg_tag: 5.1-nvidia2004
        sonarr_tag: latest
        extra_packages: libnppig10

Yields:

cont-init: info: running /etc/cont-init.d/90-sma-config
E: Package 'libva' has no installation candidate
cont-init: info: /etc/cont-init.d/90-sma-config exited 0

Using:

    container_name: sonarr-sma
    build:
      context: https://github.com/mdhiggins/sonarr-sma.git#build
      args:
        ffmpeg_tag: 4.4-nvidia2004
        sonarr_tag: develop

Also returns the "Package 'libva' has no installation candidate" error in the logs.

And then this from ffmpeg -h

ffmpeg: error while loading shared libraries: libnppig.so.11: cannot open shared object file: No such file or directory

@mdhiggins
Owner

mdhiggins commented Nov 10, 2022

Hm see if trying the package libva2 fixes things

@thebigbadoof
Author

You probably have to link the CUDA libraries, as bradfrosty did earlier. LD_LIBRARY_PATH: /usr/local/cuda/lib64 was where his shared libraries lived; mine live in /usr/local/cuda-11.4/lib64. I'm unsure why the default link doesn't work, or why the libraries end up in different locations. Since this issue is no longer applicable with the rebase, I will close it. I haven't revisited this topic in a long time as I stopped trying things, but I believe the general guidance from the other issues should suffice. Thanks for all the help, @mdhiggins and @bradfrosty.
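For anyone hitting the same path confusion, a minimal sketch to locate the real CUDA library directory inside the container before setting LD_LIBRARY_PATH (assuming `ls` and `ldconfig` are available in the image):

```shell
# The unversioned /usr/local/cuda symlink may be absent on some images,
# so list the versioned directories that actually exist.
ls -d /usr/local/cuda*/lib64 2>/dev/null

# Check whether the missing library (libnppig in the errors above) is
# already on the loader path; if not, point LD_LIBRARY_PATH at the
# directory printed by the command above.
ldconfig -p | grep libnppig || echo "libnppig not found on the loader path"
```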

@wdckwrth

@thebigbadoof had it. I needed to add in the library path. So I would say the issue is still there with the rebase, but adding the path seems to be the workaround for 3 of us now.

I am still getting the libva issue but I have a functioning ffmpeg in the container. Going to keep working on testing. I will start a new issue/thread for other issues. Thanks @thebigbadoof and @mdhiggins!
