Local AI Runners

Tools for running large language models locally on your own hardware

A comparison of three projects:

ollama — Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma, and other models.
llama.cpp — LLM inference in C/C++.
gpt4all — Run local LLMs on any device. Open source and available for commercial use.
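Of the three, ollama is the most convenient to script against: once running, it exposes a REST API on localhost:11434. Below is a minimal sketch of a request body for its /api/generate endpoint; the model name "qwen3" is a placeholder for any model already fetched with `ollama pull`.

```python
import json

# Sketch of a request body for ollama's local /api/generate endpoint.
# "qwen3" is a placeholder; substitute any model pulled via `ollama pull`.
payload = {
    "model": "qwen3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return a single JSON object instead of a token stream
}

body = json.dumps(payload)
print(body)
# POST this body to http://localhost:11434/api/generate (e.g. with curl or
# urllib.request); the JSON reply's "response" field holds the generated text.
```

llama.cpp and gpt4all take different routes to the same end: llama.cpp ships a `llama-cli` binary that loads a GGUF model file directly, while gpt4all offers a desktop app and a Python SDK.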
                   ollama     llama.cpp   gpt4all
Popularity
  Stars            168,537    103,058     77,306
  Global Rank      #35        #93         #170
Weekly Activity
  New Stars        +323       +397        +32
  Pushes           2          138         0
  Issues Closed    12         160         0
Community
  Forks            15,519     16,675      8,336
  Contributors     596        1,613       122
  Open Issues      2,907      1,451       758
Project Info
  Owner            ollama     ggml-org    nomic-ai
  License          MIT        MIT         MIT
  Language         Go         C++         C++
  Created          Jun 2023   Mar 2023    Mar 2023