Nexus is an AI Router that provides a unified endpoint for all your MCP (Model Context Protocol) servers, APIs, and LLM providers. It enables you to aggregate, govern, and manage your entire AI infrastructure through a single interface.
Nexus acts as a unified gateway between AI assistants and multiple MCP servers. Instead of configuring each MCP server individually in every AI tool, Nexus provides:
- Single endpoint for all your MCP tools
- Unified authentication with OAuth2 token forwarding
- Performance optimization through intelligent connection caching
- Tool aggregation with automatic namespacing to prevent conflicts
- Enterprise-ready security with TLS and authentication options


Get Nexus working with your AI assistant in minutes:
```bash
# Using the install script
curl -fsSL https://nexusrouter.com/install | bash

# Or run it with Docker
docker run -p 8000:8000 \
  -v $(pwd)/nexus.toml:/etc/nexus.toml \
  ghcr.io/grafbase/nexus:latest
```
Create a `nexus.toml` file:
```toml
# Basic MCP configuration
[mcp]
enabled = true
path = "/mcp"

# Add a simple file system server
[mcp.servers.filesystem]
cmd = ["npx", "-y", "@modelcontextprotocol/server-filesystem", "/home/user/documents"]

# Add a GitHub MCP server
[mcp.servers.github]
url = "https://api.github.com/mcp"

[mcp.servers.github.auth]
token = "{{ env.GITHUB_TOKEN }}"
```
Then start Nexus:
```bash
# Start with default settings
nexus

# Or specify a config file
nexus --config ./nexus.toml
```
To connect Cursor:
- Open Cursor Settings (Cmd+, on macOS)
- Search for "Model Context Protocol"
- Enable MCP support
- Add to the MCP server configuration:
```json
{
  "nexus": {
    "transport": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
```
Nexus simplifies MCP integration by exposing just two tools to AI assistants:
- `search` - Discover tools from all connected MCP servers
- `execute` - Run any discovered tool
This design allows Nexus to aggregate tools from multiple servers without overwhelming the AI assistant with hundreds of individual tools.
Flexible server support:
- STDIO (subprocess), SSE, and HTTP MCP servers
- Automatic protocol detection for remote servers
- Environment variable substitution in configuration
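For illustration, the quick-start configuration above already exercises all three of these points. The same entries are repeated here with comments noting which capability each one demonstrates (nothing new is assumed beyond the earlier example):

```toml
[mcp.servers.filesystem]
# STDIO server: Nexus launches the subprocess and speaks MCP over stdin/stdout
cmd = ["npx", "-y", "@modelcontextprotocol/server-filesystem", "/home/user/documents"]

[mcp.servers.github]
# Remote server: whether the endpoint speaks SSE or HTTP is detected automatically
url = "https://api.github.com/mcp"

[mcp.servers.github.auth]
# Environment variable substitution keeps the secret out of the config file
token = "{{ env.GITHUB_TOKEN }}"
```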
Smart tool discovery:
- Natural language search across all connected servers
- Fuzzy matching for finding relevant tools quickly
- Namespaced tool names (e.g. `filesystem__read_file`) prevent conflicts between servers
- Keeps context lean: the assistant sees two tools instead of hundreds of individual ones
Built-in security:
- OAuth2 authentication with JWT validation
- Token forwarding to downstream servers
- TLS configuration for secure connections
- CORS and CSRF protection
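To give a feel for where these settings live, here is a rough, hypothetical sketch in `nexus.toml`. The table and key names (`server.oauth`, `server.tls`, `forward`) are illustrative assumptions, not the documented schema; the Server Configuration reference has the real one:

```toml
# Hypothetical keys for illustration only - see the Server Configuration docs for the actual schema

[server.oauth]
# Validate incoming JWTs against the identity provider's JWKS endpoint (assumed key name)
url = "https://idp.example.com/.well-known/jwks.json"

[server.tls]
# Serve the Nexus endpoint over HTTPS (assumed key names)
certificate = "/etc/nexus/cert.pem"
key = "/etc/nexus/key.pem"

[mcp.servers.github.auth]
# Forward the caller's token downstream instead of a static one (assumed key name)
forward = true
```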
Simple deployment:
- Single binary installation
- Docker support with minimal configuration
- Health checks and monitoring endpoints
To connect Claude Code:
```bash
claude mcp add --transport http nexus http://localhost:8000/mcp
```
Once connected, your AI assistant will see two tools:
- Use `search` to find available tools: `search for "file read"`
- Use `execute` to run them: `execute filesystem__read_file with path "/home/user/documents/readme.md"`
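Behind the scenes, both steps are ordinary MCP `tools/call` requests against the Nexus endpoint. The sketch below shows what the two calls might look like; the JSON-RPC envelope is standard MCP, but the exact argument shapes of `search` and `execute` (the `keywords` field and the nested `name`/`arguments` pair) are assumptions here rather than the documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "keywords": ["file", "read"] }
  }
}
```

The assistant then runs the tool it picked from the results through `execute`:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "execute",
    "arguments": {
      "name": "filesystem__read_file",
      "arguments": { "path": "/home/user/documents/readme.md" }
    }
  }
}
```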
Next steps:
- Installation - Install Nexus using various methods
- Server Configuration - Configure the Nexus server and security settings
- MCP Configuration - Set up and manage MCP servers
Typical use cases for developers:
- Consolidate multiple development tools into one interface
- Switch between local and remote MCP servers seamlessly
- Test and debug MCP server implementations
For teams and enterprises:
- Share a common set of tools across the organization
- Control access to sensitive tools with authentication
- Monitor tool usage and performance
- Implement governance and compliance for AI tool usage
- Secure sensitive operations with OAuth2 and token forwarding
- Scale MCP server deployments with load balancing
Need help?
- GitHub Issues: github.com/grafbase/nexus/issues
- Discord Community: Join our Discord
- Documentation: You're already here!