  • If you didn’t already know, you can run some small models locally with an entry-level GPU.

    For example, I can run Llama 3 8B or Mistral 7B on a GTX 1060 3 GB with Ollama. It is about as bad as GPT-3.5 Turbo, so overall mildly useful.

    Although there is quite a bit of controversy over what counts as an “open source” model — most are only “open weight”.
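
    A rough sketch of why those models (almost) fit on a 3 GB card: Ollama serves 4-bit-quantized weights by default, which is roughly half a byte per parameter, and offloads whatever doesn’t fit in VRAM to the CPU. The half-byte figure and the omission of KV-cache/buffer overhead are simplifying assumptions, not exact numbers.

    ```python
    # Back-of-the-envelope VRAM estimate for quantized LLM weights.
    # Assumption: ~0.5 bytes/parameter for 4-bit quantization (q4-style),
    # ignoring KV cache and runtime buffers, so real usage is higher.
    def approx_weights_gb(params_billion: float, bytes_per_param: float = 0.5) -> float:
        return params_billion * 1e9 * bytes_per_param / 1024**3

    for name, size_b in [("Mistral 7B", 7.0), ("Llama 3 8B", 8.0)]:
        print(f"{name}: ~{approx_weights_gb(size_b):.1f} GB of weights")
    ```

    Both land a bit above 3 GB, which is why Ollama splits the layers between the GPU and system RAM on a 3 GB card — it runs, just slower than a full-GPU setup.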