I’d like to avoid building a completely new server just to run a GPU for AI workloads. Currently everything runs on my laptop, but its dated CPU only really works well for smaller models.

Now I have an NVIDIA M40. Could I possibly get it working using Thunderbolt and an enclosure or something? Note: the laptop runs Linux.

  • poVoq@slrpnk.net · 9 days ago

    There are external GPU cases that might work with your laptop, but at least the older models were relatively bandwidth-limited. That doesn’t matter much for gaming, but I guess it might cause more problems with AI workloads? On the other hand, maybe not, if the model fits completely into the VRAM of the M40?
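The fits-in-VRAM question above can be sanity-checked with quick arithmetic. A minimal sketch, assuming the 24 GB M40 variant (a 12 GB variant also exists), roughly one byte per parameter for an 8-bit-quantized model, and a loosely assumed 2 GB of overhead for KV cache and runtime buffers:

```python
# Rough VRAM-fit estimate for a Tesla M40 (24 GB variant assumed;
# a 12 GB variant also exists). All figures are illustrative.

M40_VRAM_GB = 24.0

def fits_in_vram(params_billion: float, bytes_per_param: float,
                 overhead_gb: float = 2.0) -> bool:
    """True if weights plus rough KV-cache/runtime overhead fit in VRAM."""
    weights_gb = params_billion * bytes_per_param
    return weights_gb + overhead_gb <= M40_VRAM_GB

# Examples: a 13B model at 8-bit vs. a 70B model at 4-bit quantization
print(fits_in_vram(13, 1.0))   # 13 GB + 2 GB overhead -> True
print(fits_in_vram(70, 0.5))   # 35 GB + 2 GB overhead -> False
```

If the whole model stays resident like this, the PCIe link is mostly idle during inference, which is why a bandwidth-limited enclosure can still work out.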

  • dingdongitsabear@lemmy.ml · 9 days ago

    regarding the pricey enclosures: there are vastly cheaper eGPU solutions, especially if you’re able to utilise the on-board M.2 or mini-PCIe slot. if you don’t move the laptop around, it’s a viable option. this would be an example - not an endorsement. you’d need a ~$15 PSU to power the graphics card, and it works well on linux, with hotpluggability being the primary issue; if you’re willing to shut down before attaching the eGPU, there are close to no issues.

    you can run it as a graphics card (i.e. utilise its display outputs) or just use the laptop’s display, optionally switching between the onboard and discrete graphics.
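Once the model is resident in VRAM, the narrow link of a cheap M.2 or mini-PCIe adapter mostly costs you model-load time, not inference speed. A rough comparison sketch; the GB/s figures are ballpark usable-throughput assumptions, not spec maxima:

```python
# Rough model-load-time comparison across PCIe link widths.
# Throughput values are assumed real-world estimates, not measurements.

LINKS_GBPS = {
    "PCIe 3.0 x1 (cheap adapter)": 0.8,
    "PCIe 3.0 x4 (M.2 slot)": 3.0,
    "PCIe 3.0 x16 (desktop slot)": 12.0,
}

def load_seconds(model_gb: float, link_gbps: float) -> float:
    """Seconds to copy the model weights over the link."""
    return model_gb / link_gbps

for name, gbps in LINKS_GBPS.items():
    t = load_seconds(13.0, gbps)  # e.g. a ~13 GB quantized model
    print(f"{name}: ~{t:.0f} s to load")
```

So even a single-lane adapter only adds seconds at startup, which matches the observation that these cheap solutions are viable for stationary setups.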

  • Possibly linux@lemmy.zip · 9 days ago

    By the time you’ve paid for the enclosure, you could have bought another old computer. Pick up an old workstation and put the GPU in it. Be mindful of the power requirements, though.