From aa676b313fffcdaef34b107a943a72133eeb8fa1 Mon Sep 17 00:00:00 2001
From: Jeffrey Morgan
Date: Sun, 16 Nov 2025 20:56:09 -0800
Subject: [PATCH] docs: link to ollama.com instead of hardcoding list of cloud
 models (#13110)

---
 docs/cloud.mdx | 10 ++--------
 1 file changed, 2 insertions(+), 8 deletions(-)

diff --git a/docs/cloud.mdx b/docs/cloud.mdx
index cea27216f..4f4c3722b 100644
--- a/docs/cloud.mdx
+++ b/docs/cloud.mdx
@@ -9,15 +9,9 @@ sidebarTitle: Cloud
 
 Ollama's cloud models are a new kind of model in Ollama that can run without a powerful GPU. Instead, cloud models are automatically offloaded to Ollama's cloud service while offering the same capabilities as local models, making it possible to keep using your local tools while running larger models that wouldn't fit on a personal computer.
 
-Ollama currently supports the following cloud models, with more coming soon:
+### Supported models
 
-- `deepseek-v3.1:671b-cloud`
-- `gpt-oss:20b-cloud`
-- `gpt-oss:120b-cloud`
-- `kimi-k2:1t-cloud`
-- `qwen3-coder:480b-cloud`
-- `glm-4.6:cloud`
-- `minimax-m2:cloud`
+For a list of supported models, see Ollama's [model library](https://ollama.com/search?c=cloud).
 
 ### Running Cloud models