AI Gateway
Route, monitor, and protect LLM requests through a unified gateway with cost tracking and guardrails.
17 articles
Getting started
Set up the AI Gateway from scratch: add a provider key, create an endpoint, test it, and start routing requests.
Analytics
Monitor LLM usage, costs, and guardrail activity across all providers.
Endpoints
Configure LLM provider endpoints with model selection, API keys, and system prompts.
Playground
Test endpoints with an interactive chat interface before routing production traffic.
Guardrails
Configure PII detection and content filtering rules to protect AI requests.
Gateway settings
Manage API keys, budget limits, and guardrail configuration.
Virtual keys
Generate API keys for developers to access the gateway with any OpenAI-compatible SDK.
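Because the gateway speaks the OpenAI wire format, a virtual key can be used from any OpenAI-compatible client. A minimal stdlib-only sketch, assuming a hypothetical gateway base URL and placeholder key (both would come from your actual gateway settings):

```python
# Hypothetical sketch: calling the AI Gateway with a virtual key using only
# the Python standard library. GATEWAY_BASE and VIRTUAL_KEY are placeholders;
# any OpenAI-compatible SDK could be pointed at the same base URL instead.
import json
import urllib.request

GATEWAY_BASE = "https://gateway.example.com/v1"  # assumed gateway URL
VIRTUAL_KEY = "vk-..."                           # key issued from the gateway

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request routed via the gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        url=f"{GATEWAY_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {VIRTUAL_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o-mini", "Hello")
# urllib.request.urlopen(req) would send the request through the gateway,
# where it is logged, cost-tracked, and checked against guardrails.
```

Developers never see the underlying provider keys; the gateway swaps the virtual key for the real provider credential when it forwards the request.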
Logs
View, filter, and inspect every request that flows through the AI Gateway.
Prompts
Create versioned prompt templates with variables, test them with streaming responses, and bind them to endpoints.
Models
Browse the full LLM model catalog, compare features side by side, and estimate monthly costs across providers.
MCP Gateway overview
Understand what the MCP Gateway does, how it works, and why you need it for agent governance.
MCP servers
Register backend MCP servers, configure authentication, and monitor health status.
MCP tool catalog
View discovered tools, assign risk levels, and enable approval requirements.
MCP agent keys
Create scoped API keys for AI agents with tool ACLs, rate limits, and expiration.
MCP audit log
Review tool invocation history with stats, charts, filters, and pagination.
MCP approvals
Review and decide on pending tool execution requests that need human sign-off.
MCP guardrails
Configure PII detection, content filtering, and prompt injection rules for MCP tool inputs.