# Demo 12 — `ai-classifier` plugin
Module brief: §Module 12
## UVP
Tag LLM-generated SQL at the proxy via `application_name` keywords and generated-by comment markers. Downstream policy plugins (token-budget, llm-guardrail) read the tag.
## Use cases
- Cost attribution. “How much DB load is coming from agent traffic vs human traffic?” — tag-based query metrics.
- Compliance. Auditors need to prove non-deterministic LLM output didn’t escape the schema; the tag lets you slice the audit log by traffic source.
## What this demo shows
Three connection styles, three classification outcomes:
| `application_name` | classified | agent_id | model_id |
|---|---|---|---|
| `psql` | no | — | — |
| `gpt-shopper` | yes | `gpt-shopper` | — |
| `ClaudeAgent-claude-opus-4-7` | yes | `ClaudeAgent-claude-opus-4-7` | `claude-opus` |
A fourth case sends SQL carrying a `/* generated by GPT-4 */` comment marker: it is also classified, with a model guess of `gpt-4`.
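The detection logic above can be sketched as a pure function. This is a minimal illustration only: the names (`Classification`, `classify`, the keyword list, the model-id heuristics) are assumptions and need not match the plugin's real types in `lib.rs`.

```rust
// Hypothetical sketch of the classifier; names and heuristics are assumptions.

#[derive(Debug, Default, PartialEq)]
struct Classification {
    classified: bool,
    agent_id: Option<String>,
    model_id: Option<String>,
}

/// Illustrative keywords that mark a connection as agent traffic.
const AGENT_KEYWORDS: &[&str] = &["gpt", "claude", "agent", "llm"];

fn classify(application_name: &str, sql: &str) -> Classification {
    let name_lower = application_name.to_lowercase();

    // Case 1: application_name contains an agent keyword.
    if AGENT_KEYWORDS.iter().any(|k| name_lower.contains(k)) {
        // Crude model-id guess for names like "ClaudeAgent-claude-opus-4-7".
        let model_id = if name_lower.contains("claude-opus") {
            Some("claude-opus".to_string())
        } else {
            None
        };
        return Classification {
            classified: true,
            agent_id: Some(application_name.to_string()),
            model_id,
        };
    }

    // Case 2: SQL carries a "generated by <model>" comment marker.
    let sql_lower = sql.to_lowercase();
    if let Some(pos) = sql_lower.find("generated by ") {
        let rest = &sql_lower[pos + "generated by ".len()..];
        let model: String = rest
            .chars()
            .take_while(|c| !c.is_whitespace() && *c != '*')
            .collect();
        return Classification {
            classified: true,
            agent_id: None,
            model_id: Some(model),
        };
    }

    // Neither signal present: plain human traffic (e.g. psql).
    Classification::default()
}
```

Keeping the classifier a pure function of `(application_name, sql)` is what makes the per-path unit testing mentioned below straightforward.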
## Run it
From `demos/v0.4.0/12-ai-classifier`, run `./demo.sh`. It walks through the four cases and prints the per-`request_id` KV keys the classifier wrote (visible via `/admin/kv/ai-classifier/`).
## Implementation pointer
`HDB-HeliosDB-Proxy-Plugins/ai-classifier/src/lib.rs`. The pure-function classifier lives in `classify`; KV writes happen in `pre_query`. Six unit tests cover each detection path.
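The per-`request_id` KV layout can be illustrated with an in-memory store. Everything here (the `KvStore` type, the key names, the `pre_query` signature) is an assumption for illustration, not the plugin's actual API:

```rust
// Hypothetical sketch of the pre_query KV write; the real plugin API differs.

use std::collections::HashMap;

struct KvStore {
    map: HashMap<String, String>,
}

impl KvStore {
    fn new() -> Self {
        Self { map: HashMap::new() }
    }

    fn put(&mut self, key: &str, value: &str) {
        self.map.insert(key.to_string(), value.to_string());
    }
}

/// Record a classification under per-request keys, mirroring the
/// /admin/kv/ai-classifier/ layout described above (key names are assumed).
fn pre_query(kv: &mut KvStore, request_id: &str, agent_id: Option<&str>, model_id: Option<&str>) {
    let prefix = format!("ai-classifier/{request_id}");
    if let Some(a) = agent_id {
        kv.put(&format!("{prefix}/agent_id"), a);
    }
    if let Some(m) = model_id {
        kv.put(&format!("{prefix}/model_id"), m);
    }
}
```

Writing only the keys that have values keeps the admin listing sparse: a human `psql` connection leaves no entries at all.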
## HeliosDB compatibility
Backend-agnostic.