Learn how to interact with Nexus APIs, integrate with client libraries, and build applications using the LLM router and MCP servers.
- LLM API Usage - OpenAI-compatible API endpoints and examples
- MCP Integration - Connect Nexus to Claude Desktop, Claude Code, Cursor, and other AI agents (see the programmatic client sketch after this list)
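Desktop agents such as Claude Desktop or Cursor add Nexus through their own MCP settings, but any MCP-capable client can also connect programmatically. The sketch below is illustrative only: it assumes Nexus exposes its aggregated MCP endpoint over streamable HTTP at http://localhost:8000/mcp (as the curl example further down suggests) and that the official `mcp` Python SDK is installed.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # Connect to the aggregated MCP endpoint served by Nexus.
    # The URL is an assumption; match it to your configuration.
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List every tool Nexus aggregates from its downstream servers.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```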
Send a chat completion through the OpenAI-compatible LLM endpoint:

```bash
curl -X POST http://localhost:8000/llm/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
Call a tool on one of the aggregated MCP servers (here, the filesystem server's read_file tool):

```bash
curl -X POST http://localhost:8000/mcp/tools/filesystem/read_file \
  -H "Content-Type: application/json" \
  -d '{"path": "/home/user/document.txt"}'
```
Because the LLM endpoint is OpenAI-compatible, the official OpenAI Python SDK works against it unchanged; only the base URL needs to point at Nexus:

```python
from openai import OpenAI

# Point the client at Nexus instead of api.openai.com. Nexus holds the
# provider credentials, so the client-side API key is not used.
client = OpenAI(
    base_url="http://localhost:8000/llm/v1",
    api_key="not-used"
)

response = client.chat.completions.create(
    model="openai/gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
```
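The model names in these examples are provider-prefixed (openai/gpt-4), which is how the router identifies the configured upstream provider that should receive the request.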