
AI Gateway

Route, monitor, and protect LLM requests through a unified gateway with cost tracking and guardrails.

17 articles

01. Getting started

Set up the AI Gateway from scratch: add a provider key, create an endpoint, test it, and start routing requests.

02. Analytics

Monitor LLM usage, costs, and guardrail activity across all providers.

03. Endpoints

Configure LLM provider endpoints with model selection, API keys, and system prompts.

04. Playground

Test endpoints with an interactive chat interface before routing production traffic.

05. Guardrails

Configure PII detection and content filtering rules to protect AI requests.

06. Gateway settings

Manage API keys, budget limits, and guardrail configuration.

07. Virtual keys

Generate API keys for developers to access the gateway with any OpenAI-compatible SDK.
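Because virtual keys work with any OpenAI-compatible SDK, a client only needs to point at the gateway's base URL and send the virtual key as its API key. A minimal sketch of the request shape involved (the URL and key below are illustrative placeholders, not real gateway values):

```python
import json

# Placeholder assumptions -- substitute your gateway URL and virtual key.
GATEWAY_BASE_URL = "https://your-gateway.example.com/v1"
VIRTUAL_KEY = "vk-your-virtual-key"

def build_chat_request(model: str, messages: list) -> tuple:
    """Return the URL, headers, and JSON body of an OpenAI-compatible
    chat-completions call routed through the gateway."""
    url = f"{GATEWAY_BASE_URL}/chat/completions"
    headers = {
        # The virtual key stands in for the provider's API key.
        "Authorization": f"Bearer {VIRTUAL_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "gpt-4o", [{"role": "user", "content": "Hello"}]
)
```

With the official `openai` Python SDK the same routing is typically achieved by constructing the client with `OpenAI(base_url=GATEWAY_BASE_URL, api_key=VIRTUAL_KEY)`; no other code changes are needed.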

08. Logs

View, filter, and inspect every request that flows through the AI Gateway.

09. Prompts

Create versioned prompt templates with variables, test them with streaming responses, and bind them to endpoints.

10. Models

Browse the full LLM model catalog, compare features side by side, and estimate monthly costs across providers.

11. MCP Gateway overview

Understand what the MCP Gateway does, how it works, and why you need it for agent governance.

12. MCP servers

Register backend MCP servers, configure authentication, and monitor health status.

13. MCP tool catalog

View discovered tools, assign risk levels, and enable approval requirements.

14. MCP agent keys

Create scoped API keys for AI agents with tool ACLs, rate limits, and expiration.

15. MCP audit log

Review tool invocation history with stats, charts, filters, and pagination.

16. MCP approvals

Review and decide on pending tool execution requests that need human sign-off.

17. MCP guardrails

Configure PII detection, content filtering, and prompt injection rules for MCP tool inputs.

AI Gateway - VerifyWise User Guide