diff --git a/docs/integrations/claude-code.mdx b/docs/integrations/claude-code.mdx
index 33c5351f1..d023612c7 100644
--- a/docs/integrations/claude-code.mdx
+++ b/docs/integrations/claude-code.mdx
@@ -2,6 +2,12 @@
 title: Claude Code
 ---
 
+Claude Code is Anthropic's agentic coding tool that can read, modify, and execute code in your working directory.
+
+Claude Code can use open models such as `qwen3-coder` and `gpt-oss:20b` through Ollama's Anthropic-compatible API.
+
+![Claude Code with Ollama](https://files.ollama.com/claude-code.png)
+
 ## Install
 
 Install [Claude Code](https://code.claude.com/docs/en/overview):
@@ -27,21 +33,22 @@ Claude Code connects to Ollama using the Anthropic-compatible API.
 ```shell
 export ANTHROPIC_AUTH_TOKEN=ollama
 export ANTHROPIC_BASE_URL=http://localhost:11434
-export ANTHROPIC_API_KEY=ollama
 ```
 
 2. Run Claude Code with an Ollama model:
 
 ```shell
-claude --model qwen3-coder
+claude --model gpt-oss:20b
 ```
 
 Or run with environment variables inline:
 
 ```shell
-ANTHROPIC_AUTH_TOKEN=ollama ANTHROPIC_BASE_URL=http://localhost:11434 ANTHROPIC_API_KEY=ollama claude --model qwen3-coder
+ANTHROPIC_AUTH_TOKEN=ollama ANTHROPIC_BASE_URL=http://localhost:11434 claude --model gpt-oss:20b
 ```
 
+**Note:** Claude Code requires a large context window. We recommend at least 32K tokens. See the [context length documentation](/context-length) for how to adjust context length in Ollama.
+
 ## Connecting to ollama.com
 
 1. Create an [API key](https://ollama.com/settings/keys) on ollama.com
@@ -68,3 +75,4 @@ claude --model glm-4.7:cloud
 ### Local models
 - `qwen3-coder` - Excellent for coding tasks
 - `gpt-oss:20b` - Strong general-purpose model
+- `gpt-oss:120b` - Larger general-purpose model for more complex tasks
\ No newline at end of file
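The environment variables in the diff point Claude Code at Ollama's Anthropic-compatible endpoint. A minimal sketch of a sanity check before launching Claude Code, assuming Ollama is serving on `localhost:11434`, exposes Anthropic's `/v1/messages` route, and that the model (here `gpt-oss:20b`) has already been pulled locally:

```shell
# Sketch: confirm the Anthropic-compatible endpoint responds before pointing
# Claude Code at it. The /v1/messages path and headers mirror Anthropic's
# Messages API; the token value "ollama" is a placeholder, as in the docs above.
curl http://localhost:11434/v1/messages \
  -H "content-type: application/json" \
  -H "x-api-key: ollama" \
  -d '{
    "model": "gpt-oss:20b",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

A JSON response with generated text indicates the endpoint is reachable and the model is available; an error here usually means the model has not been pulled or Ollama is not running.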