`models.json` is a configuration file for customizing the model list and controlling which models appear in the model dropdown. It is supported at two levels:

- `~/.codebuddy/models.json` — user-level (global) configuration, applied to all projects
- `<workspace>/.codebuddy/models.json` — project-level configuration, which takes precedence over the user-level file
Models from the two levels are merged by their `id` field. The `availableModels` field is not merged: a project-level value completely overrides the user-level one.

Full configuration structure:

```json
{
  "models": [
    {
      "id": "model-id",
      "name": "Model Display Name",
      "vendor": "vendor-name",
      "apiKey": "sk-actual-api-key-value",
      "maxInputTokens": 200000,
      "maxOutputTokens": 8192,
      "url": "https://api.example.com/v1/chat/completions",
      "supportsToolCall": true,
      "supportsImages": true
    }
  ],
  "availableModels": ["model-id-1", "model-id-2"]
}
```
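The merge rules above can be sketched in a few lines (an illustrative sketch, not CodeBuddy's actual implementation; `merge_configs` is a hypothetical name):

```python
def merge_configs(user: dict, project: dict) -> dict:
    """Merge a user-level and a project-level models.json.

    - "models" entries are merged by their "id" field; a project-level
      entry replaces the user-level entry with the same id.
    - "availableModels" is NOT merged: the project-level list, if present,
      completely overrides the user-level list.
    """
    merged = {m["id"]: m for m in user.get("models", [])}
    for m in project.get("models", []):
        merged[m["id"]] = m  # project entry wins on an id collision

    result = {"models": list(merged.values())}
    available = project.get("availableModels", user.get("availableModels"))
    if available is not None:
        result["availableModels"] = available
    return result
```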
The `models` field is an `Array<LanguageModel>`. Each `LanguageModel` supports the following fields:

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| id | string | ✓ | Unique identifier for the model |
| name | string | - | Display name shown in the UI |
| vendor | string | - | Model vendor (e.g. OpenAI, Google) |
| apiKey | string | - | API key (the actual key value, not an environment-variable name) |
| maxInputTokens | number | - | Maximum number of input tokens |
| maxOutputTokens | number | - | Maximum number of output tokens |
| url | string | - | API endpoint URL (must be the full endpoint path, usually ending in `/chat/completions`) |
| supportsToolCall | boolean | - | Whether the model supports tool calls |
| supportsImages | boolean | - | Whether the model supports image input |
| supportsReasoning | boolean | - | Whether the model supports reasoning mode |
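For reference, the schema in the table can be expressed as a typed structure (a hypothetical Python `TypedDict` sketch for local validation or tooling, not part of any CodeBuddy API):

```python
from typing import TypedDict


class _LanguageModelRequired(TypedDict):
    # "id" is the only field the table marks as required.
    id: str


class LanguageModel(_LanguageModelRequired, total=False):
    # All remaining fields are optional (total=False).
    name: str
    vendor: str
    apiKey: str
    maxInputTokens: int
    maxOutputTokens: int
    url: str
    supportsToolCall: bool
    supportsImages: bool
    supportsReasoning: bool
```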
The `url` field must be the full endpoint path, usually ending in `/chat/completions`, for example `https://api.openai.com/v1/chat/completions` or `http://localhost:11434/v1/chat/completions`.

The `availableModels` field is an `Array<string>` of model IDs to show in the dropdown.

Example: adding a custom model:

```json
{
  "models": [
    {
      "id": "my-custom-model",
      "name": "My Custom Model",
      "vendor": "OpenAI",
      "apiKey": "sk-custom-key-here",
      "maxInputTokens": 128000,
      "maxOutputTokens": 4096,
      "url": "https://api.myservice.com/v1/chat/completions",
      "supportsToolCall": true
    }
  ]
}
```
Example: overriding the endpoint of an existing model:

```json
{
  "models": [
    {
      "id": "gpt-4-turbo",
      "name": "GPT-4 Turbo (Custom Endpoint)",
      "vendor": "OpenAI",
      "url": "https://my-proxy.example.com/v1/chat/completions",
      "apiKey": "sk-your-key-here"
    }
  ]
}
```
Example: restricting the dropdown to specific models:

```json
{
  "availableModels": [
    "gpt-4-turbo",
    "gpt-4o",
    "my-custom-model"
  ]
}
```
Example: project-level configuration (`<workspace>/.codebuddy/models.json`):

```json
{
  "models": [
    {
      "id": "project-a-model",
      "name": "Project A Model",
      "vendor": "OpenAI",
      "url": "https://project-a-api.example.com/v1/chat/completions",
      "apiKey": "sk-project-a-key",
      "maxInputTokens": 100000,
      "maxOutputTokens": 4096
    }
  ],
  "availableModels": ["project-a-model", "gpt-4-turbo"]
}
```
Configuration files are loaded and merged in order:

1. `~/.codebuddy/models.json` (user level)
2. `<workspace>/.codebuddy/models.json` (project level)

Models added via `models.json` are automatically tagged `custom`, which makes them easy to identify and filter in the UI.

SmartMerge strategy: the `availableModels` filter is applied only after all merging has completed.

The `url` field generally ends in `/chat/completions`. Valid values:

- `https://api.openai.com/v1/chat/completions`
- `https://api.myservice.com/v1/chat/completions`
- `http://localhost:11434/v1/chat/completions`
- `https://my-proxy.example.com/v1/chat/completions`

Invalid values (bare base URLs missing the endpoint path):

- `https://api.openai.com/v1`
- `https://api.myservice.com`
- `http://localhost:11434`
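The base-URL mistake above is easy to catch before saving the file (an illustrative helper; `is_full_chat_endpoint` is a made-up name, not a CodeBuddy function):

```python
from urllib.parse import urlparse


def is_full_chat_endpoint(url: str) -> bool:
    """Return True if the URL looks like a complete chat-completions
    endpoint rather than a bare API base URL."""
    return urlparse(url).path.endswith("/chat/completions")


# Full endpoint paths pass; bare base URLs do not.
assert is_full_chat_endpoint("https://api.openai.com/v1/chat/completions")
assert not is_full_chat_endpoint("https://api.openai.com/v1")
assert not is_full_chat_endpoint("http://localhost:11434")
```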
Example: OpenRouter:

```json
{
  "models": [
    {
      "id": "openai/gpt-4o",
      "name": "open-router-model",
      "url": "https://openrouter.ai/api/v1/chat/completions",
      "apiKey": "sk-or-v1-your-openrouter-api-key",
      "maxInputTokens": 128000,
      "maxOutputTokens": 4096,
      "supportsToolCall": true,
      "supportsImages": false
    }
  ]
}
```
Example: DeepSeek:

```json
{
  "models": [
    {
      "id": "deepseek-chat",
      "name": "DeepSeek Chat",
      "vendor": "DeepSeek",
      "url": "https://api.deepseek.com/v1/chat/completions",
      "apiKey": "sk-your-deepseek-api-key",
      "maxInputTokens": 32000,
      "maxOutputTokens": 4096,
      "supportsToolCall": true,
      "supportsImages": false
    }
  ]
}
```
Example: combining a cloud model with a local Ollama model:

```json
{
  "models": [
    {
      "id": "gpt-4o",
      "name": "GPT-4o",
      "vendor": "OpenAI",
      "apiKey": "sk-your-openai-key",
      "maxInputTokens": 128000,
      "maxOutputTokens": 16384,
      "supportsToolCall": true,
      "supportsImages": true
    },
    {
      "id": "my-local-llm",
      "name": "My Local LLM",
      "vendor": "Ollama",
      "url": "http://localhost:11434/v1/chat/completions",
      "apiKey": "ollama",
      "maxInputTokens": 8192,
      "maxOutputTokens": 2048,
      "supportsToolCall": true
    }
  ],
  "availableModels": ["gpt-4o", "my-local-llm"]
}
```
If a model does not appear in the dropdown, check that:

- the model is listed in `availableModels`
- the `models` configuration is correct
- the basic fields (`id`, `name`, `vendor`) are all provided
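The checklist above can be automated with a small validator (a sketch that assumes `id` is the only strictly required field, per the field table; function and message wording are hypothetical):

```python
import json


def validate_models_config(text: str) -> list[str]:
    """Return a list of human-readable problems found in a models.json
    document; an empty list means the checks passed."""
    problems = []
    try:
        config = json.loads(text)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]

    ids = set()
    for i, model in enumerate(config.get("models", [])):
        if "id" not in model:
            problems.append(f"models[{i}] is missing the required 'id' field")
            continue
        ids.add(model["id"])
        url = model.get("url", "")
        if url and not url.endswith("/chat/completions"):
            problems.append(f"models[{i}].url does not end with /chat/completions")

    # Entries in availableModels that name no configured model are flagged;
    # built-in model ids would also be legal, so treat these as warnings.
    for model_id in config.get("availableModels", []):
        if model_id not in ids:
            problems.append(f"availableModels references unknown id '{model_id}'")
    return problems
```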