Hi,

I'd like to run some large language models locally on my Apple Silicon machine, something along the lines of PrivateGPT or the setups described in the odd Medium article, both to protect my privacy and to get some extra help.

Does anyone have recommendations or guides I could follow?

Thank you very much.

  • c10l@lemmy.world · 10 months ago
    On macOS I’ve been using Ollama. It’s very easy to set up, and it can run as a service and expose an API.

    You can talk to it directly from the CLI (ollama run <model name>) or via applications and plugins (like https://continue.dev) that consume the API.
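    As a rough sketch of what consuming that API looks like, assuming the service is running on its default port 11434 and you've already pulled a model (here "llama2" as an example, use whatever model you actually have), something like this works from Python with just the standard library:

    ```python
    import json
    import urllib.request

    # Request body for Ollama's /api/generate endpoint.
    payload = {
        "model": "llama2",             # assumed model name; substitute the one you've pulled
        "prompt": "Why is the sky blue?",
        "stream": False,               # return a single JSON object instead of a stream
    }

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",   # default local Ollama endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    # The non-streaming response carries the generated text in the "response" field.
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
    ```

    The plugins mentioned above basically do the same thing under the hood, just with streaming and chat history layered on top.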

    It can run on Linux but I haven’t personally tried it.

    https://ollama.ai/