Ollama

The easiest way to run LLMs locally: supports 100+ models, three commands to get started

The most popular tool for running LLMs locally. Supports Llama, DeepSeek, Gemma, Qwen, Phi, and 100+ other models. Setup is dead simple: a curl install, then ollama pull, then ollama run. Automatically manages...
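The three-command setup above looks roughly like this; the install script URL is Ollama's official one, and `llama3.2` is just an example model tag (any supported model works):

```shell
# Install Ollama via the official install script (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model's weights to the local model store
ollama pull llama3.2

# Start an interactive chat session with the model
ollama run llama3.2
```

On first use, `ollama run` will pull the model automatically if it is not already present, so the `pull` step is optional.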

v7.0