Any hardware transcoding advantages at all using a "12th Gen Intel i7-1260P - Iris Xe" CPU versus an "Nvidia GTX 1080 Ti" GPU?
I know both hardware transcoding solutions can transcode many simultaneous streams (more than I even need). I also know that the Intel solution uses less power.
However, does the latest Intel solution support more media stream formats than the Nvidia GPU solution, or is it the other way around? I couldn't find a comparison chart. I have both available to me.
Intel's Quick Sync Video codec support matrix is here: https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video, and Nvidia's is here: https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new. The biggest difference seems to be that the 12th gen Intel supports AV1 decode, while the 1080 Ti doesn't.
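To make the difference between the two matrices concrete, here is a small sketch that transcribes my reading of the linked tables into sets and diffs them. The codec lists are my summary of those two pages (Quick Sync on 12th-gen Iris Xe vs. NVDEC/NVENC on the Pascal-based 1080 Ti), not an official Plex compatibility list, so double-check against the links before relying on them.

```python
# Rough transcription of the two support matrices linked above.
# These values are my reading of those tables, not official data.
IRIS_XE = {
    "decode": {"H.264", "HEVC", "VP9", "AV1"},
    "encode": {"H.264", "HEVC", "VP9"},
}
GTX_1080_TI = {
    "decode": {"H.264", "HEVC", "VP9"},
    "encode": {"H.264", "HEVC"},
}

def exclusive_support(a: dict, b: dict) -> dict:
    """Codecs supported by `a` but not by `b`, per direction."""
    return {d: sorted(a[d] - b[d]) for d in ("decode", "encode")}

print(exclusive_support(IRIS_XE, GTX_1080_TI))
# AV1 decode (and VP9 encode) show up only on the Intel side
```

Running the diff the other way returns empty sets, i.e. nothing in these tables that the 1080 Ti does and the Iris Xe doesn't.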
Thanks, that was very helpful. Is Quick Sync AV1 hardware transcoding even supported in Plex? Later this year, the RTX 4050 AD106 might have AV1 encoding capabilities.
I think you could likely work without the GPU and save on power consumption. From what I've been reading, these 12th Gen CPUs are pretty powerful, and many people are recommending the i5-12500.
Do you need to transcode and tonemap 4K HDR media?
If running Windows, you'll want to use the 1080.
If running Linux, you can use either.
Plex does not support hardware accelerated tonemapping using Intel graphics on Windows.
https://support.plex.tv/articles/hdr-to-sdr-tone-mapping/
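The OS/GPU rule above can be summed up in a tiny decision helper. This is purely illustrative, it mirrors the linked support article and is not Plex's actual code; the function name and signature are my own.

```python
# Illustrative helper for the advice above: on Windows, Plex's
# hardware-accelerated HDR-to-SDR tone mapping works with Nvidia but
# not with Intel graphics; on Linux either vendor works. Mirrors the
# linked support article -- not Plex's actual logic.
def hw_tonemap_supported(os_name: str, gpu_vendor: str) -> bool:
    os_name, gpu_vendor = os_name.lower(), gpu_vendor.lower()
    if os_name == "linux":
        return gpu_vendor in ("intel", "nvidia")
    if os_name == "windows":
        return gpu_vendor == "nvidia"
    return False  # other platforms: treat as unsupported in this sketch

print(hw_tonemap_supported("Windows", "Intel"))   # False
print(hw_tonemap_supported("Windows", "Nvidia"))  # True
```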
Thanks. I'm running Windows. That was very helpful. I remember that not long ago only Jellyfin on Linux supported HDR tone mapping... only a handful of people got it to work. Your post made my decision pretty easy.