Hello c/Selfhosted!

Although I’m still fairly new to TrueNAS, I’ve been a happy TrueNAS SCALE self-hoster for about a year, and I’ve been expanding the reach of my self-hosted server little by little.

The problem came when I decided to add Jellyfin and a GPU for encoding. My server is mostly built from old parts, and the GPU is no different. TrueNAS SCALE recognizes it as “Advanced Micro Devices, Inc. [AMD/ATI] Cape Verde PRO [Radeon HD 7750/8740 R7 250E]”, which AFAIK supports hardware encoding/decoding according to the Jellyfin wiki.

But the only places I can see the GPU are in lspci and in System Settings/Isolated GPU PCI Ids (and it’s not isolated). Whenever I try to change an app’s configuration to allocate the GPU, the only option I can select is “Allocate 0 amd.com/gpu GPU”; there are no other options.

I’ve searched for this a lot, but I’ve found very little info about AMD GPUs and how to debug this issue.

Am I missing something? Could anybody point me in the right direction? Are there any commands I can run to diagnose this?
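
For example, are these the kind of checks that would help? (Just a sketch of what I mean by diagnosing; I’m assuming the SCALE apps system still runs on k3s here.)

    # Are any render/display nodes exposed to userspace at all?
    ls -l /dev/dri

    # Does the apps cluster advertise the amd.com/gpu resource the UI is looking for?
    k3s kubectl describe node | grep -i 'amd.com/gpu'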

Thanks for reading!

  • Vik@lemmy.world · 2 points · 2 months ago

    Since this is GCN-based, you may be able to use the newer AMDGPU kernel driver? I’m not sure about that specific SKU, however. I remember that using AMDGPU on, for example, Hawaii (like the R9 290) was particularly finicky.
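
    A quick way to see which of the two drivers the kernel actually picked (rough sketch; nothing TrueNAS-specific about these commands):

        # which driver (radeon or amdgpu) is bound to the VGA device
        lspci -nnk | grep -iA3 vga

        # is the amdgpu module even shipped with the running kernel?
        modinfo amdgpu | head -n 3

        # boot-time messages from either driver
        dmesg | grep -Ei 'radeon|amdgpu'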

    • Fenixin@sh.itjust.worksOP · 1 point · 2 months ago

      Not sure how to switch the driver to amdgpu. I blacklisted the radeon driver, but the kernel didn’t load the other one. I read somewhere that I have to do an initramfs update, but the command doesn’t exist in TrueNAS SCALE. How do I force it to load the other driver?
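
      For reference, what I tried is basically the generic Debian recipe, which is probably why it falls flat on SCALE’s appliance-managed boot (the file path and the update-initramfs step below are the stock Debian way, not anything TrueNAS-specific):

          # /etc/modprobe.d/blacklist-radeon.conf
          blacklist radeon

          # on plain Debian you'd then rebuild the initramfs and reboot,
          # but update-initramfs isn't present here
          update-initramfs -u && reboot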

      • chameleon@fedia.io · 3 points · 2 months ago

        For that card, you probably have to set the radeon.si_support=0 amdgpu.si_support=1 kernel options to allow amdgpu to work. I don’t have a TrueNAS system lying around, so I don’t know the idiomatic way to change them there.
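
        On a plain Debian box that would be a GRUB edit roughly like the below; whether SCALE persists it the same way or wants it fed through its own UI/middleware, I can’t say:

            # /etc/default/grub  (generic Debian; SCALE may manage this differently)
            GRUB_CMDLINE_LINUX_DEFAULT="quiet radeon.si_support=0 amdgpu.si_support=1"

            # regenerate the bootloader config, then reboot
            update-grub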

        Using amdgpu on that card has been considered experimental ever since it was added like 6 years ago, and nobody has invested any real effort in stabilizing it. It’s entirely possible that amdgpu on that card is simply never gonna work. But yeah, I think the radeon driver isn’t really fully functional anymore either, so I guess it’s worth a shot…

        • vividspecter@lemm.ee · 3 points · 2 months ago

          > Using amdgpu on that card has been considered experimental ever since it was added like 6 years ago

          If I recall right, it hasn’t been enabled by default simply because it is missing some features like analog TV out support (which most people don’t want or need in 2024).