-
@ jamw
2025-04-29 21:38:16
Just had a quick look into this and it seems possible to do it for free, i.e. to run open-source models like Llama 2, Mistral, or Phi-2 locally on the Mac using Ollama. No internet, no API keys, no limits, and Apple Silicon runs them well.
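For anyone wanting to try it, the basic Ollama workflow looks roughly like this (a sketch; the exact model tags come from the Ollama library and may change over time):

```shell
# Install Ollama on macOS (Homebrew formula; the official installer also works)
brew install ollama

# Pull a model, e.g. Mistral 7B (a multi-GB download the first time)
ollama pull mistral

# Start an interactive chat session in the terminal
ollama run mistral

# Or pass a single prompt directly
ollama run mistral "Summarise this in one sentence: ..."
```

Everything runs locally, so after the initial model download no network connection is needed.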