Continue

1.1 Usage

Method 1: Basic Configuration

Add the configuration to ~/.continue/config.json, or to continue/config.json in the project root:

{
  "models": [
    {
      "name": "HPC-AI Tech",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key"
    }
  ]
}
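Before restarting the IDE, it can be worth a quick sanity check that the file is valid JSON and that each model entry carries the fields an OpenAI-compatible provider needs. A minimal sketch (using the example values from above; in practice you would read your own config file):

```python
import json

# Example config from above; in practice, read ~/.continue/config.json.
config_text = """
{
  "models": [
    {
      "name": "HPC-AI Tech",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key"
    }
  ]
}
"""

config = json.loads(config_text)  # raises json.JSONDecodeError on malformed JSON
for entry in config["models"]:
    # Each entry for an OpenAI-compatible provider needs these keys.
    missing = {"name", "provider", "model", "apiBase", "apiKey"} - entry.keys()
    assert not missing, f"missing keys: {missing}"

print("config OK:", [m["name"] for m in config["models"]])
```

If the file is malformed, json.loads fails with the offending line and column, which is usually enough to locate a stray comma or missing quote.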

Method 2: Full Configuration Example

{
  "name": "HPC-AI Tech",
  "version": "0.0.1",
  "schema": "v1",
  "models": [
    {
      "name": "HPC-AI Tech",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key",
      "defaultCompletionOptions": {
        "maxTokens": 4000,
        "temperature": 0.7
      },
      "roles": ["chat", "edit", "apply", "embed"]
    }
  ],
  "context": {
    "providers": [
      { "provider": "problems" },
      { "provider": "tree" },
      { "provider": "url" },
      { "provider": "search" },
      { "provider": "folder" },
      { "provider": "codebase" },
      { "provider": "web" }
    ]
  }
}
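The defaultCompletionOptions block supplies sampling parameters that apply whenever a request does not override them. A hypothetical sketch of that merge logic (apply_defaults is illustrative, not a Continue API; the snake_case keys are the OpenAI wire-format names):

```python
defaults = {"maxTokens": 4000, "temperature": 0.7}

def apply_defaults(payload, defaults):
    """Fill in sampling params (OpenAI wire names) unless the request already sets them."""
    merged = dict(payload)
    merged.setdefault("max_tokens", defaults["maxTokens"])
    merged.setdefault("temperature", defaults["temperature"])
    return merged

req = apply_defaults({"model": "minimax/minimax-m2.5"}, defaults)
print(req["max_tokens"], req["temperature"])  # 4000 0.7
```

A per-request temperature, if present, wins over the default, so the config value acts only as a fallback.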

Method 3: Multiple Roles

{
  "models": [
    {
      "name": "Chat Model",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key",
      "roles": ["chat", "edit", "apply"]
    },
    {
      "name": "Autocomplete Model",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key",
      "roles": ["autocomplete"]
    },
    {
      "name": "Embeddings Model",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key",
      "roles": ["embed"]
    }
  ]
}

Method 4: Legacy Completions Endpoint

For models that support /completions but not /chat/completions:

{
  "models": [
    {
      "name": "HPC-AI Tech",
      "provider": "openai",
      "model": "minimax/minimax-m2.5",
      "apiBase": "https://api.hpc-ai.com/inference/v1",
      "apiKey": "your-hpc-ai-api-key",
      "useLegacyCompletionsEndpoint": true
    }
  ]
}
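The practical difference between the two endpoints is the request shape: /chat/completions takes a list of role-tagged messages, while the legacy /completions endpoint takes a flat prompt string. A minimal sketch of the two payloads (field names follow the OpenAI wire format; no request is actually sent here):

```python
model = "minimax/minimax-m2.5"
user_text = "Hello"

# POST {apiBase}/chat/completions -- modern chat format, role-tagged messages
chat_payload = {
    "model": model,
    "messages": [{"role": "user", "content": user_text}],
}

# POST {apiBase}/completions -- legacy format, a single prompt string
legacy_payload = {
    "model": model,
    "prompt": user_text,
}
```

With useLegacyCompletionsEndpoint set to true, requests use the second shape, which is why the flag helps when a backend only implements /completions.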

1.2 Troubleshooting

Common Issue             | Solution
Model not in dropdown    | Verify the config.json format
Chat mode not supported  | Set useLegacyCompletionsEndpoint: true

1.3 References