I assume it’s not nvidia. Yet I have no idea how to differentiate between them and neither do I know what a good price is.
Let’s say I don’t want to think about what the video type is. I just want a smooth experience.
Edit: thank you guys!
I hear good things about the Intel Arc A380. You basically only need it to convert video, and the Intel is not too bad at that for not too steep a price.
Thx. 130€? That’s surprisingly cheap.
It is not terribly performant but you don’t really need that for Jellyfin, and it’s a good value.
https://www.tomshardware.com/reviews/intel-arc-a380-review/5
Blows the 6950XT and 3090 out of the water in transcoding performance. I would say that is performing very well. And that was before the drivers improved dramatically, so the gap is probably even bigger now.
I have one, it is fantastic.
Someone said that it is “not terribly performant”, but that doesn’t matter for transcoding. It can do multiple 4K streams of AV1 and HEVC. That is perfect.
According to benchmarks, it beat the 3080 and 6800XT when it was released for transcoding performance. That is what you have to look at in this case, you aren’t gaming on it.
Just remember to enable all of the correct kernel modules to get it working. You often have to manually download the firmware git repo and move it to the firmware folder in Debian to get it working.
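To make the firmware step above concrete, here is a rough sketch of what it looks like on Debian. This assumes an Arc (DG2) card and the standard i915 driver; exact firmware filenames and paths on your system may differ.

```shell
# Check that the i915 kernel driver loaded and claimed the card:
lsmod | grep i915

# If the GPU firmware is missing, the kernel log will complain about it:
dmesg | grep -i "i915.*firmware"

# Pull the upstream linux-firmware repo and copy the i915 blobs into place:
git clone --depth 1 https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git
sudo cp linux-firmware/i915/* /lib/firmware/i915/

# Rebuild the initramfs so the firmware is available at boot:
sudo update-initramfs -u
```

After a reboot, `/dev/dri/renderD128` (or similar) should exist and `vainfo` should list the supported codec profiles.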
How do I need to configure Jellyfin to get this working properly?
I added

    devices:
      - /dev/dri

and I also tried

    devices:
      - /dev/dri/renderD128
but neither works. Moreover, I enabled encoding in HEVC format, hardware encoding, and selected hardware acceleration with Intel Quick Sync (QSV), and enabled hardware decoding for H264, HEVC, … But with that enabled, transcoding doesn’t work at all in the player.
I guess I’m failing somewhere. Any advice?
    podman exec -it jellyfin /usr/lib/jellyfin-ffmpeg/vainfo
    Trying display: drm
    error: failed to initialize display
I managed to enable it by giving it privileged access.
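For reference, privileged mode shouldn’t be necessary; passing the device through with the host’s render group is usually enough. A minimal compose sketch, assuming a stock jellyfin/jellyfin image (the GID is an assumption, check `getent group render` on your host):

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin
    # Pass the DRI nodes through so the container sees the GPU:
    devices:
      - /dev/dri:/dev/dri
    # Add the host's render group so /dev/dri/renderD128 is accessible
    # without --privileged. 989 is an example GID, not a universal value.
    group_add:
      - "989"
    volumes:
      - ./config:/config
      - ./media:/media
    ports:
      - "8096:8096"
```

If `vainfo` inside the container still fails after this, it is usually a permissions mismatch on the render node rather than a Jellyfin setting.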
Intel Arc A310. They’re $100, support AV1 and powered completely by the PCIe bus. Combine it with Tdarr and you can compress your media library down to half the size easily while still being able to easily stream to any device you have.
The most impressed I’ve been with hardware encoding and decoding is with the built in graphics on my little NUC.
I’m using a NUC10i5FNH which was only barely able to transcode one vaguely decent bitrate stream in software. It looked like passing the hardware transcoding through to a VM was too messy for me so I decided to reinstall linux straight on the hardware.
The hardware encoding and decoding performance was absolutely amazing. I must have opened up about 20 jellyfin windows that were transcoding before I gave up trying and called it good enough. I only really need about 4 maximum.
The graphics on the 10th-generation NUCs is the same sort of thing that is on the 9th- and 10th-gen desktop CPUs, so if you have an Intel CPU with onboard graphics, give it a try.
It’s way less trouble than the last time I built a similar setup with NVidia. I haven’t tried a Radeon card yet, but the jellyfin docs are a bit more negative about AMD.
I didn’t even know Jellyfin had hardware transcode till this post, but I’m with this guy, Intel’s QSV is great. I have my Plex server running bare metal on a gen 2 HP Chromebox. It’s dual core, but hardware transcode with Intel QSV will do like 20+ 1080p streams.
Intel integrated graphics is pretty phenomenal for ~5 user HTPC setups, and NUCs are basically the best Intel products ever. Nothing better than it just working out of the box.
This. I used a P1000 for transcoding and eventually switched to a 12th gen Intel chip with integrated UHD 770 graphics. It completely blew me away. Insanely low power draw and barely breaks a sweat transcoding multiple streams. Consider this route over a GPU if you can.
I’d look into AV1 decoding benchmarks, regardless of NVIDIA vs AMD, as I’ve been using NVIDIA on Jellyfin for a while with no issues.
HEVC is not as relevant IMO, as it’s not available in browsers due to license restrictions (ffmpeg/mpv work fine), so I’d focus on AV1 capability, which many cards don’t have.
I can’t get my nvidia to work with it :(
Intel integrated graphics or if you want to go overkill go with an Arc GPU.
Avoid AMD
Avoid AMD? Why do you say that?
It is terrible for media hardware acceleration. I’m saying that out of both personal experience and the Jellyfin wiki
I’ve been using my 6700XT for about 4 years with Emby and have had 0 issues
For all of my streaming devices it re-encodes to h264, which AMD can easily do. I have yet to have any files fail to decode/encode.
No good hardware acceleration for video.
Do you have first hand experience?
My 6700XT is calling bullshit
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
Git: popular version control system, primarily for code
NUC: Next Unit of Computing, brand of Intel small computers
PCIe: Peripheral Component Interconnect Express
3 acronyms in this thread; the most compressed thread commented on today has 5 acronyms.
[Thread #742 for this sub, first seen 8th May 2024, 06:15]
I’m using nvidia right now with a 3060. It doesn’t use much power, I got it pretty cheap on eBay, and it encodes/decodes everything except AV1 encoding, which I don’t have a use for. Looking at the charts in the link below, if you need to encode AV1 you’d need a 4000-series card.
https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
I’ve found nvidia to work well for Jellyfin. I use Docker with the NVIDIA container toolkit and it just worked with hardware encoding out of the gate. I also have some other Docker containers running gen AI, and the 3060 handles them well as long as the model fits in VRAM.
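For anyone trying the same route, a sketch of the run command, assuming the NVIDIA container toolkit is already installed and registered with Docker (container name and volume paths are placeholders):

```shell
# Expose all GPUs to the container so NVENC/NVDEC are available:
docker run -d \
  --name jellyfin \
  --gpus all \
  -v ./config:/config \
  -v ./media:/media \
  -p 8096:8096 \
  jellyfin/jellyfin

# Verify the card is actually visible inside the container:
docker exec jellyfin nvidia-smi
```

If `nvidia-smi` works inside the container, selecting NVENC in Jellyfin’s transcoding settings should pick the card up.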
That’s not power efficient
I think it entirely depends on your use case and hardware. I have a rack server; I need the extra power relatively frequently, as well as the 16x 2.5" bays and the 4 NICs. A rack server is a fairly power-efficient package to get all those features in. However, it means that I am limited to discrete graphics, as Xeons don’t have Intel QSV. There’s also no monitor connected and no 3D rendering happening, so the card is gonna idle at around 5 W and probably only use 20-30 W while transcoding. Compared to a system that’s idling at ~250 W, that’s nothing.
My RX580 does the job just fine. Does 1080p at 3x realtime for HEVC, and 10x for h.264.
They’re dirt cheap second hand.