Ollama Now Supports Kimi-K2.5, GLM-5, MiniMax, and More Models


Ollama continues to expand its local LLM runtime with a wave of newly supported models, including Kimi-K2.5, GLM-5, and MiniMax, alongside staples like DeepSeek and Qwen. With each release it cements its position as one of the most popular ways for developers to run models locally.
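Getting one of the new models running follows Ollama's usual pull-and-run workflow; a minimal sketch (the exact model tags below are assumptions — check the Ollama model library for the published names):

```shell
# Download one of the newly supported models, then chat with it.
# Tag "kimi-k2.5" is illustrative; verify the real tag in the library.
ollama pull kimi-k2.5
ollama run kimi-k2.5 "Summarize the tradeoffs of running LLMs locally."

# Ollama also serves a local REST API (port 11434 by default),
# so the same model is reachable programmatically:
curl http://localhost:11434/api/generate -d '{
  "model": "kimi-k2.5",
  "prompt": "Hello",
  "stream": false
}'
```

The CLI and the REST API share the same model store, so anything pulled once is available to both.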
