NVENC capable FFMPEG build for alpine #36
I have yet to find a reliable ffmpeg container for nvenc and alpine linux, but that's essentially what you need; linuxserver has been migrating all their containers over to alpine, and I think that's why you're seeing issues.

I did expand the capability of the build dockerfile to allow you to use custom ffmpeg containers beyond the jrottenberg one, or pull from the repository. You can also specify a custom URL if someone has a premade binary for you. Finally, you can just create the binaries yourself and use volumes to mount them into your container like any other docker container.

https://github.com/funkypenguin/embyserver-nvenc

Looks like this container compiles ffmpeg with nvenc under alpine, so you could potentially pull the binaries from that, though I haven't tested it.

Open to suggestions if you find a container, but that's what you need: an alpine base with ffmpeg built with NVENC |
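(A sketch of the build-arg route — the `ffmpeg_tag` build arg and `#build` context appear in the configs later in this thread; the tag value and image name below are illustrative:)

```bash
# Build sonarr-sma from the build branch against a chosen jrottenberg/ffmpeg tag;
# the tag value here is illustrative.
docker build https://github.com/mdhiggins/sonarr-sma.git#build \
  --build-arg ffmpeg_tag=4.3-nvidia2004 \
  -t sonarr-sma:nvenc

# Or mount binaries you built yourself over the container's copies, e.g.
#   -v /path/to/ffmpeg:/usr/local/bin/ffmpeg \
#   -v /path/to/ffprobe:/usr/local/bin/ffprobe
```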
Just to clarify some of the tags
|
Thank you for the insight! |
Definitely keep me posted, as I assume this will become a bigger issue for more people when the latest container goes alpine in the near future. I unfortunately don't have an nvidia server to test against, so any input is appreciated and it would be nice to get a solution |
Will do. And I'm sorry, I'm not very versed in docker building, but where does the #build branch get its dockerfile to build? The one in the repo doesn't seem to be what it's built from. I wanted to try and see if building proper glibc into the image would work, and I can now see why my previous attempts to modify the dockerfile failed, since I wasn't actually changing the correct one. |
I was successful in getting glibc into the image and running the programs, but it seems like I can't convert anything using nvidia, as it would error out. I will have to revisit it when I have more time, sorry, but this is what you would add to the dockerfile
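(The exact snippet wasn't kept here; a minimal sketch, assuming the usual sgerrand glibc apk route for adding glibc to alpine — the pinned version is illustrative:)

```dockerfile
# Add a glibc compatibility layer to alpine via the sgerrand apk packages.
# The 2.34-r0 version pin is illustrative; use whichever release is current.
RUN apk add --no-cache wget ca-certificates && \
    wget -q -O /etc/apk/keys/sgerrand.rsa.pub https://alpine-pkgs.sgerrand.com/sgerrand.rsa.pub && \
    wget -q https://github.com/sgerrand/alpine-pkg-glibc/releases/download/2.34-r0/glibc-2.34-r0.apk && \
    apk add glibc-2.34-r0.apk && \
    rm glibc-2.34-r0.apk
```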
and it was from the Stack Overflow post from earlier. I'm not sure if I was testing correctly, but I was using the command |
What kind of ffmpeg error did you get? |
Hello, I would like to clarify my previous statement about "running the programs": I didn't actually run them per se, but just checked whether they could be called at all from the prompt, calling
calling ffmpeg
and calling the above command to test
I'm not very versed in ffmpeg, but I hope that helps answer your question! |
Hm, you got pretty far there. What command were you running that's giving you that error message? |
I was running |
Not sure if the preset could potentially be creating issues, but a cleaner test command that eliminates the other encoders and removes any extra parameters is probably worthwhile.
Additionally, you can try with a bitrate.
If that doesn't work, then it probably still isn't correctly compiled |
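(A minimal test along those lines — the file name is a placeholder and the bitrate is arbitrary:)

```bash
# Encode with hevc_nvenc only: fixed bitrate, no preset, no extra parameters;
# write to the null muxer so only the encode path is exercised.
ffmpeg -i input.mkv -an -c:v hevc_nvenc -b:v 5M -f null -
```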
So even with building glibc in, it doesn't work. It may be that I need to look both at the dockerfile jrottenberg uses for alpine and at the nvidia dockerfile, to see if building ffmpeg with alpine as the base on top of glibc inside that container would make it work... When I have more time I'll see if that'll work. |
So I did not have any more luck trying to get an alpine-capable ffmpeg build, so I attempted to just downgrade sonarr to the update right before the switch to alpine, which I found to be
It seems like something with this container does not like ffmpeg with nvenc. The container seems to have GPU access with commands like |
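(The specific commands weren't preserved; a typical check, using the container name from the compose file later in this thread:)

```bash
# Confirm the GPU is visible from inside the running container
docker exec -it sonarr nvidia-smi
```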
This should be working; I know other users before the alpine switch had working NVEnc setups. Here's a post for reference: mdhiggins/sickbeard_mp4_automator#1444 Not sure if that would be helpful to compare. What jrottenberg tag were you using for ffmpeg? |
@KaHooli Just curious if you have encountered any solutions to getting nvenc ffmpeg builds on alpine, since I know you were using a similar setup before they migrated from ubuntu |
Yeah, that's why I'm stumped. I've seen previous posts where it worked just fine for people, but they were from last year, so I'm not sure if sonarr did something on their end. I've tried a lot of the tags; the ones I've built with, I believe, range from 4.3-nvidia2004 all the way to 5.0. I also tried to make it work on latest, which was still on 1804, and those nvidia builds were a year old. I also built my own and tested that it worked, so the ffmpeg itself is probably not the issue; I can only assume the container is doing something weird with the nvidia drivers, since using nvidia-docker2 passes your drivers to containers.
Using
and they're identical to both my jellyfin and plex containers. sonarr container os:
jellyfin:
plex:
host os:
As I'm writing this, I see that sonarr is on 20.04.3, whereas plex, jellyfin, and my host are on 20.04.4. I'll try and see if I can get a 20.04.4 image for sonarr. |
I also have been running into this. I was finally able to get it working, with no changes needed to the underlying Dockerfile itself. For me there were two general issues:

To address problem A, I went with the simple route. Instead of using

To address problem B, there were four steps:
One issue that might be specific to my version was that

Additionally, you might need to make sure your nvidia-docker is properly set up on the host machine, with the latest supported drivers for the version of ffmpeg you have installed. I am personally using ffmpeg 5.0.1, cuda 11.6, and nvidia driver 510.60.02. Even though my docker runtime is using 11.4, it still worked fine.

After all of this, I was able to verify by running manual.py and confirming the GPU ffmpeg usage with nvidia-smi. Here are my configs for reference:

```yaml
# docker-compose.yaml
sonarr:
  restart: unless-stopped
  build:
    context: https://github.com/mdhiggins/sonarr-sma.git#build
    args:
      ffmpeg_tag: 5.0.1-nvidia2004
  environment:
    PUID: ${PUID}
    PGID: ${PGID}
    TZ: ${TZ}
    SMA_HWACCEL: true
    NVIDIA_VISIBLE_DEVICES: all
    NVIDIA_DRIVER_CAPABILITIES: all
    LD_LIBRARY_PATH: /usr/local/cuda-11.4/lib64
  runtime: nvidia
  security_opt:
    - no-new-privileges:true
  volumes:
    - ./config/sonarr:/config
    - ./config/sma:/usr/local/sma/config
    - /mnt/storage/downloads:/downloads
    - /mnt/storage/gmedia:/media
```

```ini
; autoProcess.ini
[Converter]
ffmpeg = ffmpeg
ffprobe = ffprobe
threads = 0
hwaccels = nvenc, cuvid, vaapi, dxva2, qsv, d3d11va
hwaccel-decoders = hevc_cuvid, h264_cuvid, mjpeg_cuvid, mpeg1_cuvid, mpeg2_cuvid, mpeg4_cuvid, vc1_cuvid, vp8_cuvid, vp9_cuvid, hevc_qsv, h264_qsv, hevc_vaapi, h264_vaapi
hwdevices = cuda:0
hwaccel-output-format = cuda:cuda
output-directory =
output-format = mp4
output-extension = mp4
temp-extension =
minimum-size = 0
ignored-extensions = nfo, ds_store
copy-to =
move-to =
delete-original = True
sort-streams = True
process-same-extensions = False
bypass-if-copying-all = False
force-convert = False
post-process = False
wait-post-process = False
detailed-progress = False
opts-separator = ,
preopts =
postopts =
regex-directory-replace = [^\w\-_\. ]
temp-output = False
[Permissions]
chmod = 0644
uid = -1
gid = -1
[Metadata]
relocate-moov = True
full-path-guess = True
tag = True
tag-language = eng
download-artwork = poster
sanitize-disposition =
strip-metadata = False
keep-titles = False
[Video]
codec = hevc_nvenc, hevc
max-bitrate = 0
bitrate-ratio =
crf = -1
crf-profiles =
preset =
codec-parameters =
dynamic-parameters = False
max-width = 0
profile =
max-level = 0.0
pix-fmt =
filter =
force-filter = False
prioritize-source-pix-fmt = True
[HDR]
codec =
pix-fmt =
space = bt2020nc
transfer = smpte2084
primaries = bt2020
preset =
codec-parameters =
filter =
force-filter = False
profile =
[Audio]
codec = ac3
languages =
default-language =
first-stream-of-language = False
allow-language-relax = True
channel-bitrate = 128
variable-bitrate = 0
max-bitrate = 0
max-channels = 2
prefer-more-channels = True
filter =
profile =
force-filter = False
sample-rates =
sample-format =
copy-original = False
aac-adtstoasc = False
ignore-truehd = mp4, m4v
ignored-dispositions =
unique-dispositions = False
stream-codec-combinations =
[Universal Audio]
codec = aac
channel-bitrate = 128
variable-bitrate = 0
first-stream-only = False
filter =
profile =
force-filter = False
[Audio.ChannelFilters]
6-2 = pan=stereo|FL=0.5*FC+0.707*FL+0.707*BL+0.5*LFE|FR=0.5*FC+0.707*FR+0.707*BR+0.5*LFE
[Subtitle]
codec = mov_text
codec-image-based =
languages =
default-language =
first-stream-of-language = False
encoding =
burn-subtitles = False
burn-dispositions =
embed-subs = True
embed-image-subs = False
embed-only-internal-subs = False
filename-dispositions = forced
ignore-embedded-subs = False
ignored-dispositions =
unique-dispositions = False
attachment-codec =
[Subtitle.CleanIt]
enabled = False
config-path =
tags =
[Subtitle.Subliminal]
download-subs = False
download-hearing-impaired-subs = False
providers =
[Subtitle.Subliminal.Auth]
opensubtitles =
tvsubtitles =
[Sonarr]
host = 127.0.0.1
port = 8989
apikey =
ssl = False
webroot =
force-rename = False
rescan = True
block-reprocess = False
[Radarr]
host = 127.0.0.1
port = 7878
apikey =
ssl = False
webroot =
force-rename = False
rescan = True
block-reprocess = False
[Sickbeard]
host = localhost
port = 8081
ssl = False
apikey =
webroot =
username =
password =
[Sickrage]
host = localhost
port = 8081
ssl = False
apikey =
webroot =
username =
password =
[SABNZBD]
convert = True
sickbeard-category = sickbeard
sickrage-category = sickrage
sonarr-category = sonarr
radarr-category = radarr
bypass-category = bypass
output-directory =
path-mapping =
[Deluge]
sickbeard-label = sickbeard
sickrage-label = sickrage
sonarr-label = sonarr
radarr-label = radarr
bypass-label = bypass
convert = True
host = localhost
port = 58846
username =
password =
output-directory =
remove = False
path-mapping =
[qBittorrent]
sickbeard-label = sickbeard
sickrage-label = sickrage
sonarr-label = sonarr
radarr-label = radarr
bypass-label = bypass
convert = True
action-before =
action-after =
host = localhost
port = 8080
ssl = False
username =
password =
output-directory =
path-mapping =
[uTorrent]
sickbeard-label = sickbeard
sickrage-label = sickrage
sonarr-label = sonarr
radarr-label = radarr
bypass-label = bypass
convert = True
webui = False
action-before =
action-after =
host = localhost
ssl = False
port = 8080
username =
password =
output-directory =
path-mapping =
[Plex]
host = localhost
port = 32400
refresh = False
token =
[Audio.Sorting]
sorting = language, channels.d, map, d.comment
default-sorting = channels.d, map, d.comment
codecs =
[Subtitle.Sorting]
sorting = language, d.comment, d.default.d, d.forced.d
codecs =
```
|
This unfortunately will be a short-lived solution, as the latest tag will probably be updated to alpine in the near future, but I appreciate you sharing what you've learned |
It seems like until nvidia-docker supports alpine, those using hwaccel with an nvidia gpu will need to stick to the last ubuntu release tag. |
Looks like as of the recent develop branch commit they are rebasing sonarr back to ubuntu |
It looks like sonarr has been rebased to focal. Does your post imply that nvenc should be "do-able" now? I have been poking away at it but can't get a functioning ffmpeg installed. I am using the nvidia runtime and can run nvidia-smi in the container. Thanks. |
What build config are you using? |
Been using this one:
`ffmpeg -h` Adding in `extra_packages`:
Yields: `cont-init: info: running /etc/cont-init.d/90-sma-config` Using:
Also returns the `package 'libva' has no installation candidate` error in the logs. And then this from `ffmpeg -h`:
`ffmpeg: error while loading shared libraries: libnppig.so.11: cannot open shared object file: No such file or directory` |
Hm, see if trying the package |
You probably have to link the cuda libraries, as bradfrosty did earlier. |
@thebigbadoof had it. I needed to add in the library path. So I would say the issue is still there with the rebase, but adding the path seems to be the workaround for three of us now. I am still getting the libva issue, but I have a functioning ffmpeg in the container. Going to keep working on testing; I will start a new issue/thread for other issues. Thanks @thebigbadoof and @mdhiggins! |
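(For reference, the workaround amounts to setting the CUDA library path in the container environment, mirroring the `LD_LIBRARY_PATH` line in bradfrosty's compose file above; the exact cuda directory depends on the ffmpeg build you pulled:)

```yaml
# Make the bundled CUDA libraries visible to ffmpeg at runtime
environment:
  LD_LIBRARY_PATH: /usr/local/cuda/lib64
```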
hello,
I am trying to build sonarr-sma with ffmpeg nvidia, but am running into errors. I am attempting to build it with docker compose like this

being my last attempt. I initially built it with `ffmpeg_tag=4.4-nvidia` and `sonarr_tag=latest` but ran into the issue when attempting to run ffmpeg or ffprobe. I read in mdhiggins/sickbeard_mp4_automator#1470 that I needed to match the version, so I switched the tag to `ffmpeg_tag=4.4-nvidia1804`, but it was no joy. I attempted a few other tags, only to run into the same issue. I could manually run ffprobe and ffmpeg if I went into the container and did `export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda/lib64`, but did not know how to apply it when built. I attempted messing with the dockerfile but was unsuccessful.

I then attempted to build with `sonarr_tag=develop` and `ffmpeg_tag=4.4-nvidia2004`, as seen from a couple of other users in issues, but it looks like the develop branch has moved their image to alpine, which does not seem to work with jrottenberg's nvidia images. I attempted a few other tags but kept getting the errors below. Attempting to run manual.py errors out like above. The only fix I tried was this Stack Overflow suggestion of adding `libc6-compat` as an extra package to install, but to no avail.

I have verified that the files `/usr/local/bin/ffprobe` and `/usr/local/bin/ffmpeg` do exist in those locations as well.

I don't know what else to attempt, so any insight would be greatly appreciated.