From b816ff86c923e0290f58f2275e831fc17c29ba37 Mon Sep 17 00:00:00 2001
From: Parth Sareen
Date: Wed, 26 Mar 2025 17:34:18 -0700
Subject: [PATCH] docs: make context length faq readable (#10006)

---
 docs/faq.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/docs/faq.md b/docs/faq.md
index 66959cca7..f418da47f 100644
--- a/docs/faq.md
+++ b/docs/faq.md
@@ -20,7 +20,13 @@ Please refer to the [GPU docs](./gpu.md).
 
 ## How can I specify the context window size?
 
-By default, Ollama uses a context window size of 2048 tokens. This can be overridden with the `OLLAMA_CONTEXT_LENGTH` environment variable. For example, to set the default context length to 8K, use: `OLLAMA_CONTEXT_LENGTH=8192 ollama serve`.
+By default, Ollama uses a context window size of 2048 tokens.
+
+This can be overridden with the `OLLAMA_CONTEXT_LENGTH` environment variable. For example, to set the default context window to 8K, use:
+
+```shell
+OLLAMA_CONTEXT_LENGTH=8192 ollama serve
+```
 
 To change this when using `ollama run`, use `/set parameter`: