LangGraph
1.1 API Configuration
LangGraph builds on LangChain, so any LangChain chat model can serve as the agent's LLM. To use an OpenAI-compatible endpoint, configure ChatOpenAI with openai_api_base pointing at that endpoint:
Basic Configuration
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    openai_api_base="https://api.hpc-ai.com/inference/v1",  # any OpenAI-compatible endpoint
    api_key="sk-your-hpc-ai-api-key",
    model="minimax/minimax-m2.5",
)
Environment Variables
import os

from langchain_openai import ChatOpenAI

# ChatOpenAI falls back to these variables when no explicit arguments are passed
os.environ["OPENAI_API_BASE"] = "https://api.hpc-ai.com/inference/v1"
os.environ["OPENAI_API_KEY"] = "sk-your-hpc-ai-api-key"

llm = ChatOpenAI(model="minimax/minimax-m2.5")
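Either way, ChatOpenAI ultimately issues a standard chat-completions POST against the configured base URL, which is why any OpenAI-compatible server works. A minimal sketch of the request it sends (field names follow the OpenAI chat completions format; the /chat/completions path is the convention such servers expose):

```python
import json

# Base URL from the configuration above; ChatOpenAI appends the
# standard chat-completions path to it.
base_url = "https://api.hpc-ai.com/inference/v1"
endpoint = f"{base_url}/chat/completions"

# The JSON body of the request: model name plus a list of role/content messages.
payload = {
    "model": "minimax/minimax-m2.5",
    "messages": [{"role": "user", "content": "Hello"}],
}

body = json.dumps(payload)
print(endpoint)
print(body)
```

Because only the base URL differs from the official OpenAI API, switching providers is purely a configuration change.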
1.2 Using in LangGraph
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Configure the custom LLM
llm = ChatOpenAI(
    openai_api_base="https://api.hpc-ai.com/inference/v1",
    api_key="sk-your-hpc-ai-api-key",
    model="minimax/minimax-m2.5",
    temperature=0.7,
)

# Create a ReAct agent (no tools attached yet)
agent = create_react_agent(llm, tools=[])

# Invoke the agent; the result is the final graph state
result = agent.invoke({"messages": [("user", "Hello")]})

# result["messages"] holds the full conversation; the last entry is the model's reply
print(result["messages"][-1].content)
1.3 References
- LangChain Integration: https://python.langchain.com/docs/integrations/llms/openai/