# Route Mode

Route mode sends requests containing PII to a local LLM. Requests without PII go to your configured provider.

## How It Works
**Request with PII**: routed to a local LLM (Ollama, vLLM, llama.cpp, etc.). PII stays on your network.
**Request without PII**: routed to your configured provider (OpenAI, Azure, etc.), with full provider performance.
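The decision above can be sketched as a small routing function. This is an illustrative sketch only, not the actual implementation: the toy regex detector, the provider names, and the `route` function are all assumptions for demonstration; a real deployment would use a full PII detection engine.

```python
# Hypothetical sketch of route mode's decision logic (not the actual implementation).
import re

# Toy PII detector: flags email addresses and US-style SSNs only.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),  # email address
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),    # SSN-like number
]

LOCAL_PROVIDER = "ollama"    # assumed name for the on_pii_detected target
DEFAULT_PROVIDER = "openai"  # assumed name for the default target

def route(prompt: str) -> str:
    """Return the provider this request should be sent to."""
    if any(p.search(prompt) for p in PII_PATTERNS):
        return LOCAL_PROVIDER  # PII never leaves the network
    return DEFAULT_PROVIDER

print(route("Summarize this document"))             # -> openai
print(route("Email alice@example.com the report"))  # -> ollama
```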
## When to Use

Use route mode when you:

- Have local GPU resources
- Need complete data isolation for sensitive requests
- Must prevent any PII from leaving your network
## Configuration

### Routing Options
| Option | Description |
|---|---|
| `default` | Provider for requests without PII |
| `on_pii_detected` | Provider for requests with PII |
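Putting the two options together, a configuration might look like the sketch below. The key names `default` and `on_pii_detected` come from the table above; the surrounding structure and the provider names are assumptions for illustration.

```yaml
# Hypothetical config sketch; only `default` and `on_pii_detected`
# are documented above, the rest is assumed structure.
mode: route
routing:
  default: openai          # requests without PII
  on_pii_detected: ollama  # requests with PII stay local
```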
### Local Provider Setup
#### Ollama
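A provider entry for a local Ollama server might look like this. The schema (`providers`, `base_url`, `model`) is an assumed sketch; the port is Ollama's default, and the model name is an example.

```yaml
# Assumed provider entry; start the server first with:
#   ollama pull llama3.1 && ollama serve
providers:
  ollama:
    base_url: http://localhost:11434  # Ollama's default port
    model: llama3.1                   # example model
```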
#### vLLM
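For vLLM, point the gateway at its OpenAI-compatible server. Again, the schema is an assumed sketch; the port is vLLM's default, and the model is an example.

```yaml
# Assumed provider entry; start vLLM's OpenAI-compatible server with:
#   vllm serve meta-llama/Llama-3.1-8B-Instruct
providers:
  vllm:
    base_url: http://localhost:8000/v1  # vLLM's default endpoint
    model: meta-llama/Llama-3.1-8B-Instruct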
#### llama.cpp
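llama.cpp ships a built-in OpenAI-compatible server (`llama-server`). The config schema below is an assumed sketch; the port is `llama-server`'s default.

```yaml
# Assumed provider entry; start the server with:
#   llama-server -m model.gguf --port 8080
providers:
  llamacpp:
    base_url: http://localhost:8080/v1  # llama-server's OpenAI-compatible endpoint
```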
#### LocalAI
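LocalAI exposes an OpenAI-compatible API as well. The schema below is an assumed sketch; the port is LocalAI's default.

```yaml
# Assumed provider entry; one way to start LocalAI is via Docker:
#   docker run -p 8080:8080 localai/localai
providers:
  localai:
    base_url: http://localhost:8080/v1  # LocalAI's default endpoint
```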
## Response Headers

Route mode sets these headers on responses. When a request is routed to the local provider:

- `fallback_language`: