(sorry if anyone got this post twice. I posted while Lemmy.World was down for maintenance, and it was acting weird, so I deleted and reposted)

  • EmoBean@lemmy.world · 10 months ago

    It’s pretty much all about your GPU’s VRAM size. Almost any computer will do as long as it has a GPU (or two) that can load >8 GB into VRAM; the computation itself really isn’t that heavy. Keeping a lot of different LLMs, or larger ones, can require a lot of storage, but your average 7B LLM only takes ~10 GB on disk.
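
    For a rough sense of where those numbers come from, here’s a back-of-envelope sketch in Python. The weights dominate the footprint, and their size is just parameter count times bytes per weight; the quantization levels and sizes below are typical llama.cpp-style approximations, not exact figures for any particular file format:

    ```python
    # Rough size of an LLM's weights alone, ignoring the KV cache and
    # other runtime overhead. Bytes-per-weight values are approximate.
    BYTES_PER_PARAM = {
        "fp16": 2.0,  # unquantized half precision
        "q8_0": 1.0,  # ~8-bit quantization
        "q4_0": 0.5,  # ~4-bit quantization
    }

    def weights_gb(params_billion: float, quant: str) -> float:
        """Approximate weight size in gigabytes."""
        # 1e9 params per billion * bytes per param / 1e9 bytes per GB,
        # so the factors of 1e9 cancel out.
        return params_billion * BYTES_PER_PARAM[quant]

    for quant in BYTES_PER_PARAM:
        print(f"7B @ {quant}: ~{weights_gb(7, quant):.1f} GB")
    ```

    That puts a 7B model around 14 GB at fp16, ~7 GB at 8-bit, and ~3.5 GB at 4-bit, which is why a quantized 7B fits comfortably on an 8 GB card and why the files on disk stay in the ~10 GB range.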

    • Fisch@lemmy.ml · 10 months ago

      I have an AMD GPU with 12 GB of VRAM, but do local LLMs even work on AMD GPUs?