docs: update default context window size to 4096 tokens (#13709)

Author: Maternion
Date: 2026-01-14 14:31:28 +05:30
Committed by: GitHub
Parent: 7d411a4686
Commit: 875cecba74


@@ -22,7 +22,7 @@ Please refer to the [GPU docs](./gpu).
 ## How can I specify the context window size?
-By default, Ollama uses a context window size of 2048 tokens.
+By default, Ollama uses a context window size of 4096 tokens.
 This can be overridden with the `OLLAMA_CONTEXT_LENGTH` environment variable. For example, to set the default context window to 8K, use:
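
The example command itself falls outside the lines shown in this hunk. A minimal sketch of what such a command looks like, assuming the server is started with the standard `ollama serve` invocation:

```shell
# Set the default context window to 8K (8192 tokens) for this server process
OLLAMA_CONTEXT_LENGTH=8192 ollama serve
```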