SuperAGI
1.1 Usage
Method 1: config.yaml (Recommended)
When deploying SuperAGI, edit config.yaml:
# config.yaml
# OpenAI-compatible API base URL
OPENAI_API_BASE: "https://api.hpc-ai.com/inference/v1"
# API Key
OPENAI_API_KEY: "sk-your-hpc-ai-api-key"
# Default model name
OPENAI_MODEL: "minimax/minimax-m2.5"
Restart SuperAGI:
docker compose up -d
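Before restarting, it can help to confirm the three settings are actually present in config.yaml. The sketch below is a standalone convenience check, not part of SuperAGI; it parses simple `KEY: "value"` lines and raises if any of the three required keys is missing or empty.

```python
# Sanity-check the three settings SuperAGI reads from config.yaml.
# Convenience sketch only, not part of SuperAGI itself.
def check_config(text: str) -> dict:
    """Parse simple `KEY: "value"` lines and flag missing values."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")  # split on first ":" only
        settings[key.strip()] = value.strip().strip('"')
    required = ("OPENAI_API_BASE", "OPENAI_API_KEY", "OPENAI_MODEL")
    missing = [k for k in required if not settings.get(k)]
    if missing:
        raise ValueError(f"config.yaml is missing: {missing}")
    return {k: settings[k] for k in required}

sample = '''
OPENAI_API_BASE: "https://api.hpc-ai.com/inference/v1"
OPENAI_API_KEY: "sk-your-hpc-ai-api-key"
OPENAI_MODEL: "minimax/minimax-m2.5"
'''
print(check_config(sample)["OPENAI_MODEL"])  # minimax/minimax-m2.5
```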
Method 2: GUI Configuration
- Start SuperAGI and open the Dashboard (default http://localhost:3000)
- Go to Settings → Models → Custom Model
- Fill in:
  - API Base URL: https://api.hpc-ai.com/inference/v1
  - API Key: sk-your-hpc-ai-api-key
  - Model Name: minimax/minimax-m2.5
- Save
Method 3: Environment Variables
Set in docker-compose.yaml or .env:
# environment section in docker-compose.yaml
services:
  superagi-backend:
    environment:
      - OPENAI_API_BASE=https://api.hpc-ai.com/inference/v1
      - OPENAI_API_KEY=sk-your-hpc-ai-api-key
      - OPENAI_MODEL=minimax/minimax-m2.5
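Inside the container, these `environment` entries surface as ordinary environment variables. The sketch below mirrors the lookup an OpenAI-compatible client would perform, so you can confirm the name-to-setting mapping; the values are the placeholders from above, and the `client_config` dict is illustrative, not SuperAGI's internal structure.

```python
import os

# Mirror how the docker-compose `environment` entries are read back.
# Illustrative only; SuperAGI's backend performs its own lookup.
os.environ.setdefault("OPENAI_API_BASE", "https://api.hpc-ai.com/inference/v1")
os.environ.setdefault("OPENAI_API_KEY", "sk-your-hpc-ai-api-key")
os.environ.setdefault("OPENAI_MODEL", "minimax/minimax-m2.5")

client_config = {
    "base_url": os.environ["OPENAI_API_BASE"],  # OpenAI-compatible endpoint
    "api_key": os.environ["OPENAI_API_KEY"],
    "model": os.environ["OPENAI_MODEL"],
}
print(client_config["base_url"])  # https://api.hpc-ai.com/inference/v1
```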
1.2 Creating Agents
After configuration, create an Agent:
- Go to the Agents tab
- Click Create Agent
- Select the configured model under LLM Model
- Set the Agent's goal, tools, etc.
- Start the Agent
1.3 Verify Configuration
# Check SuperAGI backend logs for API connectivity
docker compose logs superagi-backend --tail 50
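Independently of the SuperAGI logs, you can smoke-test the endpoint directly. The sketch below builds an authenticated request to the standard OpenAI-compatible `/models` route (an assumption about the provider, not taken from SuperAGI); uncomment the `urlopen` call to actually send it.

```python
import urllib.request

# Build (but don't send) a direct smoke test against the endpoint.
# `/models` is the standard OpenAI-compatible listing route; values are
# the placeholders from the config above.
BASE_URL = "https://api.hpc-ai.com/inference/v1"
API_KEY = "sk-your-hpc-ai-api-key"

req = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
# with urllib.request.urlopen(req, timeout=10) as resp:
#     print(resp.status, resp.read()[:200])
print(req.full_url)  # https://api.hpc-ai.com/inference/v1/models
```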
1.4 References
- Local model configuration issue: https://github.com/TransformerOptimus/SuperAGI/issues/411
- SuperAGI documentation: https://superagi.com/docs/Installation/