Alpha · Coming soon

Nothing leaves your perimeter uncovered.

Route any prompt to any model — OpenAI, Anthropic, Mistral, Gemini, or your own stack — with a local pipeline that strips, masks, and re-hydrates personal data. The upstream model sees pseudonyms; your firm sees the real names.

OpenAI-compatible gateway · API and UI
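Because the gateway speaks the OpenAI chat-completions schema, pointing existing tooling at it is just a base-URL change. A minimal sketch with the standard library — the host, port, model name, and key placeholder are illustrative assumptions, not the shipped defaults:

```python
import json
import urllib.request

# Hypothetical local deployment; the real host/port depend on your install.
GATEWAY = "http://localhost:8080/v1/chat/completions"

# Drop-in means the body is the standard OpenAI chat-completions payload;
# only the URL points at the local gateway instead of api.openai.com.
body = json.dumps({
    "model": "gpt-4o",
    "messages": [{"role": "user",
                  "content": "Draft a note on the Acme Corp merger."}],
}).encode()

req = urllib.request.Request(
    GATEWAY,
    data=body,
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <your-gateway-key>"},
)
# req is ready for urllib.request.urlopen(req); the gateway pseudonymizes
# the content before anything reaches the upstream provider.
```

The same request works unchanged with any OpenAI-compatible client library; only the base URL differs.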

What leaves your perimeter

█████ ██████ and ███████████ advised ████ ████ on the ██████ merger.

What the model sees

Person_12 and Org_4 advised Client_A on the Northern merger.

What your team sees back

Maria García and Cuatrecasas advised Acme Corp on the Nordic merger.

Built for regulated work

GDPR-aligned design
EU AI Act readiness
Schrems II & transfer risk
Attorney–client privilege posture

Five stages, one perimeter

Deterministic scrubbing first, then models, then an adversarial judge. Answers come back through the same vault so names and entities reappear only for you.

Local execution · Tunable strictness · Audit trail
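The flow above can be pictured as a pipeline of stage callables. This is a toy orchestration sketch — the stage functions, names, and the single hard-coded entity are illustrative stand-ins, not the product's implementation:

```python
def protect_and_route(prompt, scrub, ner_mask, vault, rewrite, judge, send):
    """Toy orchestration of the stages; each argument is a stage callable.
    In the real product these are local services, not plain functions."""
    text = scrub(prompt)           # 01 deterministic scrubber
    text = ner_mask(text, vault)   # 02+03 NER feeding the reversible vault
    text = rewrite(text)           # 04 SLM rewriter
    if not judge(text):            # 05 adversarial judge gates the send
        raise ValueError("re-identification risk too high")
    reply = send(text)             # 06 external LLM provider
    return vault.rehydrate(reply)  # answers return through the same vault


class ToyVault:
    """Minimal stand-in for the reversible vault."""
    def __init__(self):
        self.mapping = {}

    def mask(self, text):
        # One hard-coded entity, purely for illustration.
        self.mapping["Person_1"] = "Maria García"
        return text.replace("Maria García", "Person_1")

    def rehydrate(self, text):
        for token, real in self.mapping.items():
            text = text.replace(token, real)
        return text


vault = ToyVault()
answer = protect_and_route(
    "Maria García asked about the merger.",
    scrub=lambda t: t,                 # no scrubber patterns hit this sample
    ner_mask=lambda t, v: v.mask(t),
    vault=vault,
    rewrite=lambda t: t,
    judge=lambda t: "Maria" not in t,  # block if the real name leaked
    send=lambda t: "Advise " + t.split()[0] + " to proceed.",
)
# answer: "Advise Maria García to proceed."
```

The external model only ever handles `Person_1`; the vault swaps the real name back in after the reply returns.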

Infographic: five local stages—scrubber, NER, vault, rewriter, judge—send pseudonymized prompts to external LLM providers; a re-hydrate path brings real names back through the vault.
Everything sensitive stays in your perimeter until you choose what crosses the line—and answers return through the same vault.
01

Deterministic scrubber

02

Discriminative NER

03

Reversible vault

04

SLM rewriter

05

Adversarial judge

06

Your LLM provider

Re-hydrate response · The model's reply passes back through the vault, which restores real names before anyone reads it.

Inside the pipeline

Each stage has a job. Together they keep quasi-identifiers and edge cases from slipping through.

01

Deterministic scrubber

Regex and dictionaries for NIF/DNI, IBAN, case numbers, court references and other patterns you can explain to a regulator.
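In spirit, the scrubber layer looks like a table of named patterns swapped for numbered placeholders. A minimal sketch — these two regexes are simplified illustrations, not the production rule set, which is broader, locale-aware, and checksum-validated:

```python
import re

# Illustrative patterns only; production rules cover many more identifier
# types and validate checksums.
PATTERNS = {
    "DNI":  re.compile(r"\b\d{8}[A-HJ-NP-TV-Z]\b"),          # Spanish DNI: 8 digits + check letter
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),  # country code, check digits, BBAN
}

def scrub(text: str) -> str:
    """Replace each match with a numbered placeholder per pattern kind."""
    counts = {}
    for kind, pattern in PATTERNS.items():
        def repl(match, kind=kind):
            counts[kind] = counts.get(kind, 0) + 1
            return f"{kind}_{counts[kind]}"
        text = pattern.sub(repl, text)
    return text

masked = scrub("Pay ES9121000418450200051332 for client 12345678Z.")
# masked: "Pay IBAN_1 for client DNI_1."
```

Because every rule is an explicit pattern, each substitution can be explained to a regulator line by line.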

02

Discriminative NER

Fine-tuned encoder models spot people, organizations, locations and legal entities where rules alone miss context.

03

Reversible pseudonym vault

Deterministic per-session tokens: the same person stays Person_12 across turns. The vault re-hydrates the model reply before anyone reads it.
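The vault's behavior — deterministic tokens per entity, reversible on the way back — can be sketched in a few lines. Class and method names here are illustrative, not the shipped API:

```python
class PseudonymVault:
    """Sketch of a session-scoped, reversible pseudonym vault."""

    def __init__(self):
        self._forward = {}  # ("person", "Maria García") -> "Person_1"
        self._reverse = {}  # "Person_1" -> "Maria García"
        self._counts = {}

    def tokenize(self, kind: str, value: str) -> str:
        """Same entity in, same token out, for the whole session."""
        key = (kind, value)
        if key not in self._forward:
            self._counts[kind] = self._counts.get(kind, 0) + 1
            token = f"{kind.capitalize()}_{self._counts[kind]}"
            self._forward[key] = token
            self._reverse[token] = value
        return self._forward[key]

    def rehydrate(self, text: str) -> str:
        # A real implementation needs boundary-aware replacement so that
        # Person_1 never clobbers Person_12; plain replace keeps this short.
        for token, value in self._reverse.items():
            text = text.replace(token, value)
        return text

vault = PseudonymVault()
first = vault.tokenize("person", "Maria García")  # "Person_1"
again = vault.tokenize("person", "Maria García")  # same token across turns
reply = vault.rehydrate(f"{first} advised the client.")
# reply: "Maria García advised the client."
```

Determinism is what keeps long conversations coherent: the upstream model can refer back to Person_1 in turn ten and still mean the same individual.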

04

Generative SLM rewriter

A small model tuned with RL handles indirect identifiers and phrasing that still points to a single individual.

05

Adversarial judge SLM

A second small model attacks the masked text for re-identification risk and scores k-anonymity / l-diversity style leakage before anything is sent upstream.
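The kind of leakage the judge scores can be pictured with a toy k-anonymity check — the smallest group of records sharing a quasi-identifier combination. This heuristic is only an illustration of the concept; the real judge is a trained SLM, not this function:

```python
from collections import Counter

def min_group_size(records, quasi_ids):
    """k-anonymity in miniature: size of the smallest group of records
    sharing the same quasi-identifier combination. k == 1 means someone
    is unique, hence re-identifiable, even with direct names masked."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return min(combos.values())

masked_records = [
    {"role": "counsel", "city": "Madrid"},
    {"role": "counsel", "city": "Madrid"},
    {"role": "judge",   "city": "Bilbao"},  # unique combination
]
k = min_group_size(masked_records, ["role", "city"])
# k == 1: the Bilbao judge is uniquely identifiable despite masking
```

When the score falls below the policy threshold, the prompt never leaves the perimeter.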

Policy as code

Tune strictness per matter, client or API key — not by retraining. Legal sets the policy; engineering ships the binary.

OpenAI-compatible drop-in
matter_id: M-2025-014
strictness: high
entities: [PERSON, ORG, CASE_ID, IBAN]
allow_foreign_models: true
judge_min_score: 0.92

Example policy surface
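In application code, a policy like the sample above might surface roughly as follows. Field names mirror the sample; the class and method are an illustrative shape, not the shipped schema:

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    """Illustrative policy object; fields mirror the example surface."""
    matter_id: str
    strictness: str = "high"
    entities: list = field(
        default_factory=lambda: ["PERSON", "ORG", "CASE_ID", "IBAN"])
    allow_foreign_models: bool = True
    judge_min_score: float = 0.92

    def permits_send(self, judge_score: float) -> bool:
        # The gateway forwards a prompt only if the judge clears the bar.
        return judge_score >= self.judge_min_score

policy = Policy(matter_id="M-2025-014")
# policy.permits_send(0.95) -> True; policy.permits_send(0.80) -> False
```

Because strictness and thresholds live in data, legal can tighten a matter's policy without anyone retraining or redeploying a model.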

Route to the models you already use

Same gateway, multiple backends. Swap providers without changing your anonymization story.

OpenAI
Anthropic
Google Gemini
Mistral
Cohere
AWS Bedrock
Azure OpenAI
Ollama
vLLM

Where Veil earns its place

From litigation to public-sector disclosure — any workflow where a name in the wrong place is the whole problem.

Litigation drafting

Draft and research with LLMs without client names or opponent details crossing your perimeter.

  • Pseudonyms for parties, counsel and courts in every upstream prompt
  • Consistent mapping across a long thread, so the model keeps track of who is who
  • Re-hydration only inside your review surface

GDPR DSR / SAR responses

Generate summaries and extracts for data subjects without leaking other individuals in the same file set.

  • Entity-level masking tuned for multi-party records
  • Judge pass catches indirect references before export
  • Policy variants for HR, health and finance record mixes

M&A due diligence

Let deal teams use frontier models on counterparty and employee data under a controlled pseudonym layer.

  • Vault keeps code names stable across thousands of pages
  • Swap from Claude to GPT without redoing redaction logic
  • Audit log of what left the perimeter and when

Clinical notes (HIPAA-minded)

Summarize and structure clinical text for operations while PHI stays tokenized for external models.

  • Strong entity coverage for providers, patients and facilities
  • Local pipeline keeps raw notes off third-party training paths
  • Strictness presets for research vs. operations

Finance / KYC analysis

Run LLM-assisted review on transactions and customer files with account tokens instead of live identifiers.

  • IBAN, tax IDs and internal customer handles in the scrubber layer
  • Per-desk policies for trading vs. compliance
  • Re-hydration gated behind your existing access control

Public sector / FOIA

Prepare releases and internal briefings with consistent redaction before any cloud model sees the text.

  • Citizen and official identifiers masked to stable tokens
  • Schrems-minded routing: choose EU-only backends per policy
  • Documentation-friendly: explain scrubber rules in plain language

Join the waitlist

Alpha access, implementation partners and design reviews. Tell us who you are and what you want to run through Veil.

Your details go straight to contact@minerva-ds.com. We never sell, share, or train on them.

Indicative — final at GA

How pricing will look

Open core first. Hosted and enterprise when you need someone else to run the judge models and SLAs.

Community / OSS

Free

Self-host the gateway, models and policies. Full source for the pipeline and UI.

  • Open-source core under a permissive license
  • Run on your hardware or VPC
  • Community support and docs

Hosted

From €299 / month

Managed multi-tenant gateway with upgrades, monitoring and predictable scale.

  • Minerva-operated control plane
  • Included minor version updates
  • Standard support windows

Enterprise

Custom

Private cloud, BYOK, on-prem judge and rewriter SLMs, custom policies and contractual terms.

  • Dedicated or VPC deployment
  • SLAs and security review pack
  • Professional services for legal and compliance alignment

Be first in line

We are onboarding a small set of firms and platform teams for the alpha.