Ollama

The easiest way to run LLMs locally — supports 100+ models, 3 commands to get started

The most popular tool for running LLMs locally (165K+ ⭐, a top-50 repository on GitHub globally). Supports Llama, DeepSeek, Gemma, Qwen, Phi, and 100+ other models. Dead-simple setup: curl install → ollama pull → ollama run.
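The three steps above look roughly like this (the model name `llama3.2` is just an example; substitute any model from the Ollama library):

```shell
# Install Ollama (Linux/macOS one-liner from the official site)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model's weights to the local cache
ollama pull llama3.2

# Start an interactive chat session with the model
ollama run llama3.2
```

After `ollama run`, the model also stays available over a local HTTP API (by default on port 11434), so other tools can query it.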

v7.0