diff --git a/docs/cloud.mdx b/docs/cloud.mdx
index cea27216f2..4f4c3722b9 100644
--- a/docs/cloud.mdx
+++ b/docs/cloud.mdx
@@ -9,15 +9,9 @@ sidebarTitle: Cloud
 
 Ollama's cloud models are a new kind of model in Ollama that can run without a powerful GPU. Instead, cloud models are automatically offloaded to Ollama's cloud service while offering the same capabilities as local models, making it possible to keep using your local tools while running larger models that wouldn't fit on a personal computer.
 
-Ollama currently supports the following cloud models, with more coming soon:
+### Supported models
 
-- `deepseek-v3.1:671b-cloud`
-- `gpt-oss:20b-cloud`
-- `gpt-oss:120b-cloud`
-- `kimi-k2:1t-cloud`
-- `qwen3-coder:480b-cloud`
-- `glm-4.6:cloud`
-- `minimax-m2:cloud`
+For a list of supported models, see Ollama's [model library](https://ollama.com/search?c=cloud).
 
 ### Running Cloud models
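
For reference, the behavior the patched paragraph describes can be exercised with the official `ollama` Python client. This is a minimal sketch, not part of the patch, assuming a signed-in local Ollama install and that the `gpt-oss:120b-cloud` tag from the removed list is still published:

```python
# Minimal sketch: chatting with a cloud model through the local Ollama daemon.
# The daemon offloads the request to Ollama's cloud service, so the call looks
# identical to running a local model.
from ollama import chat

response = chat(
    model="gpt-oss:120b-cloud",  # cloud tag taken from the list removed in this patch
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```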