
SuperAGI

1.1 Usage

Method 1: Configuration File

When deploying SuperAGI, edit config.yaml:

# config.yaml

# OpenAI-compatible API base URL
OPENAI_API_BASE: "https://api.hpc-ai.com/inference/v1"

# API Key
OPENAI_API_KEY: "sk-your-hpc-ai-api-key"

# Default model name
OPENAI_MODEL: "minimax/minimax-m2.5"

Restart SuperAGI so the new configuration takes effect:

docker compose up -d

Method 2: GUI Configuration

  1. Start SuperAGI and open the Dashboard (default http://localhost:3000)

  2. Go to Settings → Models → Custom Model

  3. Fill in:

    • API Base URL: https://api.hpc-ai.com/inference/v1

    • API Key: sk-your-hpc-ai-api-key

    • Model Name: minimax/minimax-m2.5

  4. Save

Method 3: Environment Variables

Set in docker-compose.yaml or .env:

# environment section in docker-compose.yaml
services:
  superagi-backend:
    environment:
      - OPENAI_API_BASE=https://api.hpc-ai.com/inference/v1
      - OPENAI_API_KEY=sk-your-hpc-ai-api-key
      - OPENAI_MODEL=minimax/minimax-m2.5
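If you prefer a .env file, note that Docker Compose reads .env from the project directory for variable substitution in docker-compose.yaml; to pass the values into the container, reference them under the service's environment (or point env_file at the file). A minimal sketch, using the same placeholder key as above:

```shell
# .env — placed next to docker-compose.yaml
OPENAI_API_BASE=https://api.hpc-ai.com/inference/v1
OPENAI_API_KEY=sk-your-hpc-ai-api-key
OPENAI_MODEL=minimax/minimax-m2.5
```

The corresponding docker-compose.yaml entries would then read, e.g., `- OPENAI_API_KEY=${OPENAI_API_KEY}`.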

1.2 Creating Agents

Once the model is configured, create an Agent:

  1. Go to Agents tab

  2. Click Create Agent

  3. Select the configured model under LLM Model

  4. Set the Agent's goals, tools, and other options

  5. Start Agent

1.3 Verify Configuration

# Check SuperAGI backend logs for API connectivity
docker compose logs superagi-backend --tail 50
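You can also test the endpoint directly, independently of SuperAGI, by calling the OpenAI-compatible chat completions route. A sketch, assuming the same base URL, API key, and model name configured above (replace the placeholder key with your own):

```shell
# Direct smoke test of the OpenAI-compatible endpoint
curl -s https://api.hpc-ai.com/inference/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-hpc-ai-api-key" \
  -d '{
    "model": "minimax/minimax-m2.5",
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 8
  }'
```

A JSON response containing a `choices` array indicates the base URL and key are valid; an authentication error here means the problem is with the credentials rather than the SuperAGI configuration.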

1.4 References