LLM Proxy

Requests Today: 0
PHI Protected: 0
Active Sessions: 0
Avg Latency: 0 ms

Quick Setup

Just change your base URL - that's it. Your PHI is automatically de-identified before reaching the LLM and re-identified in responses.
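Conceptually, the proxy performs a de-identify/re-identify round trip around each request. The sketch below illustrates that idea only; the `deidentify`/`reidentify` functions and the `[PHI_n]` placeholder format are assumptions for illustration, not RedactiPHI's actual implementation.

```python
# Illustrative sketch of the de-identify / re-identify round trip.
# Function names and placeholder format are assumptions, not RedactiPHI's API.

def deidentify(text, phi_terms):
    """Replace each PHI term with a numbered placeholder; return text + mapping."""
    mapping = {}
    for i, term in enumerate(phi_terms):
        token = f"[PHI_{i}]"
        mapping[token] = term
        text = text.replace(term, token)
    return text, mapping

def reidentify(text, mapping):
    """Restore the original PHI terms from the placeholder mapping."""
    for token, term in mapping.items():
        text = text.replace(token, term)
    return text

safe, mapping = deidentify(
    "Summarize: Patient John Smith, DOB 01/02/1960",
    ["John Smith", "01/02/1960"],
)
# Only `safe` (placeholders, no PHI) is sent to the LLM provider;
# the proxy applies `reidentify` to the model's response on the way back.
restored = reidentify(safe, mapping)
```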

# Python - OpenAI SDK
import openai

client = openai.OpenAI(
    api_key="rph_your_api_key",  # Your RedactiPHI key
    base_url="https://llm.redact.health/v1/proxy/openai"  # Just change this!
)

# Use exactly as before - PHI is protected automatically
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize: Patient John Smith..."}]
)
# Response contains re-identified text - original PHI restored

Supported Providers

OpenAI, Anthropic, Azure OpenAI, Google Gemini
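The same base-URL swap should apply to the other SDKs. A sketch for the Anthropic SDK follows; note the `/v1/proxy/anthropic` path is inferred from the OpenAI URL shown above and may differ in your deployment, so treat it as an assumption.

```python
# Hypothetical Anthropic setup -- the /proxy/anthropic path is inferred from
# the OpenAI URL pattern and is an assumption, not a confirmed endpoint.
import anthropic

client = anthropic.Anthropic(
    api_key="rph_your_api_key",  # Your RedactiPHI key
    base_url="https://llm.redact.health/v1/proxy/anthropic",
)

response = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Summarize: Patient John Smith..."}],
)
```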

Test My Setup

Authentication: Not tested yet
Organization: Not tested yet
Policy: Not tested yet
LLM Keys: Not tested yet
De-identification: Not tested yet
Re-identification: Not tested yet
Audit Logging: Not tested yet
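Once the checks pass, one way to sanity-check the re-identification step yourself is to scan responses for un-restored placeholders. This is an illustrative check only: the `[PHI_n]` token format is an assumption, not RedactiPHI's documented placeholder scheme.

```python
# Illustrative end-to-end sanity check: a response that still contains a
# placeholder token means re-identification failed. The [PHI_n] token format
# is an assumption for illustration.
import re

def response_fully_reidentified(text):
    """Return True if no un-restored placeholder tokens remain in the text."""
    return re.search(r"\[PHI_\d+\]", text) is None
```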

Recent Requests

No recent requests. Make your first LLM proxy request to see activity here.