LexVault is an AI that answers legal questions, analyzes contracts, and runs M&A due diligence — grounded in real law, verified before it reaches you, with a full audit trail. Runs entirely on your hardware. No data sent to OpenAI, ever.
Every component runs on-premise. Disconnect the Ethernet cable and it still works.
No OpenAI, no Azure, no AWS. The LLM, embeddings, reranker, and translation all run locally on your hardware.
Deployed on an NVIDIA DGX Spark that fits on a desk. No server room, no IT department. All models run locally — 1,150 tokens/second. Always on.
Ask in English, search in German or Slovak statute text. The LLM generates search queries in the target language — zero extra latency, perfect legal terminology.
New case law and statute updates delivered monthly via secure USB drive. Plug in, wait 30 seconds, done. Your legal database stays current without ever connecting to the internet.
Five steps, fully on-premise. First results stream within seconds.
1. Translate. If the target jurisdiction uses a different language, the system translates the query. "tenant rights" becomes "Rechte der Mieter" for Austrian law. (Neural Translation · 450+ Languages · ~350ms)
2. Search. Statutes and case law are searched by semantic meaning and exact legal terms simultaneously. Results are merged and ranked by relevance. (Hybrid Vector Search · Semantic + Exact Match)
3. Rerank. A cross-encoder reads each query-document pair and scores relevance. Only results above threshold pass; noise is filtered out. (Cross-Encoder Reranker · GPU-Accelerated)
4. Generate. The AI reads the relevant law and writes a grounded answer citing specific statutes and decisions. Temperature zero — facts only, no creativity. (Open-Source LLM · GPU-Accelerated · Streaming)
5. Verify. An independent LLM judge reviews the full response against sources at temperature zero. Unsupported claims are automatically rewritten grounded in source text; claims that fail re-verification are removed. (LLM Judge · temp 0 · regenerate-or-remove)
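As a minimal sketch, the five pipeline stages chain together like this. Every function here is an illustrative stub with a toy in-memory corpus, not LexVault's actual API; the threshold value and stage signatures are assumptions.

```python
from dataclasses import dataclass

THRESHOLD = 0.5  # hypothetical reranker cutoff


@dataclass
class Claim:
    text: str
    source: str  # statute section or decision backing the claim


def translate(query, lang):
    # 1. Translate: toy lookup standing in for LLM-native translation.
    return {("tenant rights", "de"): "Rechte der Mieter"}.get((query, lang), query)


def hybrid_search(query):
    # 2. Search: tiny stand-in corpus; the real system merges
    # semantic-vector hits with exact legal-term matches.
    return ["MRG § 1", "ABGB § 1096", "unrelated tax circular"]


def rerank(query, docs):
    # 3. Rerank: fixed scores standing in for a cross-encoder.
    return [(d, 0.1 if "unrelated" in d else 0.9) for d in docs]


def generate(query, docs):
    # 4. Generate: one claim per retained source, temperature zero.
    return [Claim(f"Statement grounded in {d}", d) for d in docs]


def verify(claims, docs):
    # 5. Verify: the judge keeps only claims whose cited source
    # was actually retrieved.
    return [c for c in claims if c.source in docs]


def run_pipeline(query, target_lang):
    q = translate(query, target_lang)
    docs = [d for d, score in rerank(q, hybrid_search(q)) if score >= THRESHOLD]
    return verify(generate(q, docs), docs)


answer = run_pipeline("tenant rights", "de")
print([c.source for c in answer])  # the two statute hits survive; noise is filtered
```

Note how the noise document never reaches the generator: the reranker threshold drops it before any text is written, which is what keeps the final answer citable.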
Research, analyze, run due diligence, and export — all in one platform, all on your hardware.
Ask questions in any language. The AI searches statutes and case law, cites specific sections, and verifies every claim before responding. Upload PDF, DOCX, or TXT files for context.
Upload an entire data room. The AI classifies documents, extracts clauses, flags risks across 4 tiers, cross-references warranties against disclosures, and verifies every finding against applicable statutes. Multi-party, multi-jurisdiction.
Upload a contract and get a clause-by-clause risk assessment. Each clause rated by importance with specific statute references. Large documents automatically chunked and analyzed.
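Automatic chunking for large documents can be sketched as an overlapping sliding window, so that no clause is cut at a boundary without also appearing intact in a neighboring chunk. The window and overlap sizes here are illustrative assumptions, not LexVault's actual parameters.

```python
def chunk(text: str, size: int = 2000, overlap: int = 200) -> list[str]:
    """Split text into overlapping windows. Each chunk repeats the last
    `overlap` characters of the previous one, so a clause straddling a
    boundary still appears whole in at least one chunk."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]
```

A 5,000-character contract with these defaults yields three chunks whose overlaps stitch back into the original text exactly.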
Export research conversations, drafted contracts, and document analyses as professionally formatted DOCX or PDF files. Ready for client delivery.
Each lawyer gets their own login and conversation history. Admin dashboard with invite system, role management, usage stats, and seat control. Like ChatGPT Team, but on-premise.
Run parallel multi-source investigations across jurisdictions. Progressive streaming results. Automatic document chunking for large files. Cancel anytime.
When regulators ask "how did the AI reach that conclusion?", you have the answer. No other legal AI product offers this.
Every claim the AI makes is linked to the specific statute section or court decision it came from. Full source text attached — not just a citation number.
Before any response reaches the user, an independent LLM judge reviews the full answer against its sources at temperature zero. Unsupported claims are automatically rewritten grounded in source text. Claims that fail re-verification are removed entirely.
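The regenerate-or-remove loop described above can be sketched as follows. The `judge` and `rewrite` callables are hypothetical stand-ins for the judge LLM and the grounded-rewrite step; here they are toy functions that treat a claim as supported only if it appears verbatim in a source.

```python
def judge_and_repair(claims, sources, judge, rewrite):
    """Regenerate-or-remove: unsupported claims get one grounded rewrite;
    claims that still fail re-verification are dropped entirely."""
    kept = []
    for claim in claims:
        if judge(claim, sources):        # Verified
            kept.append(claim)
            continue
        fixed = rewrite(claim, sources)  # Rewritten from source text
        if judge(fixed, sources):        # Re-verified
            kept.append(fixed)
        # else: Removed
    return kept


# Toy stand-ins for the judge LLM and the grounded-rewrite step.
sources = {"the notice period is three months", "a written form is required"}
judge = lambda claim, srcs: claim in srcs
rewrite = lambda claim, srcs: ("a written form is required"
                               if "written" in claim else claim)

result = judge_and_repair(
    ["the notice period is three months",  # verified as-is
     "some written form may suffice",      # rewritten, then re-verified
     "verbal agreement is enough"],        # fails even after rewrite: removed
    sources, judge, rewrite)
print(result)
```

The key property is that the function can only shrink or re-ground the answer; it never lets an unverified sentence through.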
Every interaction recorded: model used, response time, sources searched, confidence score, verification result per claim (Verified, Flagged, Rewritten, Removed). Searchable, exportable, always available.
Sample audit entry: “The AI cited GmbH-Gesetz § 6 — Judge verdict: Verified — full source text: 1,847 chars — model: open-source LLM (temp 0) — response time: 1.2s — logged 2026-02-25 14:32:07”
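A structured version of the sample entry above might look like this. The field names and record shape are illustrative assumptions, not LexVault's actual log schema; the values mirror the example.

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class AuditEntry:
    citation: str         # statute section or decision the claim cites
    verdict: str          # Verified | Flagged | Rewritten | Removed
    source_chars: int     # length of the attached full source text
    model: str
    response_time_s: float
    timestamp: str        # ISO 8601, appliance-local clock


entry = AuditEntry(
    citation="GmbH-Gesetz § 6",
    verdict="Verified",
    source_chars=1847,
    model="open-source LLM (temp 0)",
    response_time_s=1.2,
    timestamp="2026-02-25T14:32:07",
)
print(json.dumps(asdict(entry), ensure_ascii=False))
```

Storing entries as structured records rather than prose is what makes the trail searchable and exportable per claim.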
Open-source models, running entirely on your hardware, with zero cloud services. The end-to-end pipeline completes in under 15 seconds, with responses streaming from ~3s.
| Layer | Technology | Details |
|---|---|---|
| LLM | Open-Source Language Model | Streaming output, temperature zero for legal research |
| Embeddings | Multilingual Embeddings | Semantic search across 100+ languages |
| Reranker | Cross-Encoder Reranker | GPU-accelerated relevance scoring (~1s) |
| Translation | LLM-Native Translation | Search queries generated in target language by the LLM — zero extra latency |
| Verification | LLM Judge (temp 0) | Reviews full response against sources, regenerates or removes unsupported claims |
| Search | Hybrid Vector Search | Semantic + exact term matching, GPU-accelerated |
| Hardware | NVIDIA DGX Spark | 128 GB unified memory, 1,150 tok/s concurrent. Clusterable for larger firms. |
| Deployment | Pre-Installed Appliance | Plug in and go, no configuration needed |
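One common way to merge the semantic and exact-match result lists in a hybrid search layer is reciprocal rank fusion. The snippet below is a generic sketch of that technique; whether LexVault uses RRF specifically is not stated, and the example documents are invented.

```python
def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Reciprocal rank fusion: merge several ranked lists by summing
    1 / (k + rank) per document. k = 60 is the conventional constant;
    documents ranked well in both lists rise to the top."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)


semantic = ["ABGB § 1096", "MRG § 1", "WEG § 16"]   # vector-similarity order
exact    = ["MRG § 1", "MRG § 30", "ABGB § 1096"]    # exact-term-match order
print(rrf([semantic, exact]))
```

Here "MRG § 1" wins because it ranks highly in both lists, even though neither list puts it unambiguously first.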
Every legal AI runs in the cloud. Every one is a black box. LexVault is neither.
| | Westlaw / CoCounsel | LexisNexis | Harvey AI | LexVault |
|---|---|---|---|---|
| Offline / Data Never Leaves Your Building | — | — | — | Yes |
| Compliance Audit Trail | — | — | — | Yes |
| LLM Judge Verified Responses | — | — | — | Yes |
| AI Legal Assistant | Add-on | Add-on | Yes | Yes |
| M&A Due Diligence | — | — | — | Yes (on-premise) |
| Document Analysis | — | Basic | Yes | Yes |
| Multi-Jurisdiction | Per product | Per product | — | Yes |
| Cross-Language Search | — | — | — | Yes |
| Cost / Lawyer / Month | $225–500 | €115–450 | $1,200+ | from €99 |
Same features. Same AI. Fraction of the price. No API costs, no overages, no cloud dependency.
| Solution | Monthly | 5-Year Total | You Save |
|---|---|---|---|
| Harvey AI ($1,200/seat) | ~€64,800 | ~€3,888,000 | €3,527,100 |
| Lexis+ AI Professional ($494) | ~€26,700 | ~€1,602,000 | €1,241,100 |
| CoCounsel All Access ($500) | ~€27,000 | ~€1,620,000 | €1,259,100 |
| CoCounsel Core ($225) | ~€12,120 | ~€727,200 | €366,300 |
| LexVault | €5,940 | €360,900 | — |
Based on published per-seat rates for a 60-lawyer firm over 60 months (USD at ~€0.90). Harvey: $1,200/seat/month (Sacra, 20-seat min). CoCounsel: published by Thomson Reuters. Lexis+ AI: published tiers. All competitors charge additional API/token costs. LexVault: unlimited queries, zero API costs — the LLM runs on your hardware.
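The 5-year figures in the table are reproducible from the per-seat rates; the snippet below recomputes two of them, assuming a 60-lawyer firm and $1 ≈ €0.90 (both assumptions derived from the table's own monthly figures).

```python
SEATS, FX, MONTHS = 60, 0.90, 60  # assumed firm size, USD→EUR rate, 5 years


def five_year_total_eur(usd_per_seat_month: float) -> int:
    """5-year cloud cost: per-seat monthly USD rate × FX × seats × months."""
    monthly_eur = usd_per_seat_month * FX * SEATS
    return round(monthly_eur * MONTHS)


print(five_year_total_eur(1200))  # Harvey AI
print(five_year_total_eur(500))   # CoCounsel All Access
```

This excludes the per-token API overages that the cloud products add on top, so the real gap is somewhat wider than the table shows.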
Each jurisdiction you add reduces the rate on your entire bill — not just the new one.
| Jurisdictions | Discount | Rate per Jurisdiction | Total per Lawyer |
|---|---|---|---|
| 1 | — | €99 | €99 /month |
| 2 | 15% | €84 | €168 /month |
| 3 | 30% | €69 | €208 /month |
| 4+ | 45% (cap) | €54 | €218+ /month |
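The tiered discount above can be expressed in a few lines. Note that the table's totals are computed from the unrounded per-jurisdiction rate, which is why 3 jurisdictions shows €208 rather than 3 × €69 = €207; this sketch reproduces that behavior.

```python
BASE = 99.0  # € per lawyer per jurisdiction per month


def monthly_total(jurisdictions: int) -> int:
    """Whole-bill discount: 15% at 2 jurisdictions, 30% at 3, capped at 45%
    from 4 onward. Rounding happens once, on the final total."""
    discount = min(0.15 * (jurisdictions - 1), 0.45)
    return round(BASE * (1 - discount) * jurisdictions)


print([monthly_total(n) for n in (1, 2, 3, 4)])  # → [99, 168, 208, 218]
```

Because the discount applies to the entire bill, adding a fourth jurisdiction raises the total by only €10/month even though each jurisdiction lists at €99.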
LexVault ships pre-installed on NVIDIA hardware that fits on a desk; no server room required. Recommended: NVIDIA DGX Spark (Grace Blackwell GB10, 128 GB LPDDR5X unified memory), delivering 1,150 tokens/second of concurrent inference via vLLM. Multiple units can be clustered over ConnectX-7 400G for larger deployments. All models run locally — no cloud, no external API, no driver complexity. Hardware is purchased separately at cost.
Pick the jurisdictions you need. Our multilingual pipeline supports 450+ languages — if a country publishes its law, we can add it.
| Jurisdiction | Coverage | Status |
|---|---|---|
| US Federal | Statutes (USC) + Regulations (CFR) | Available |
| US States | State statutes & regulations (per state) | On request |
| Austria | Core statutes & codes | Available |
| Slovakia | Core statutes & codes | Available |
| EU | Regulations, Directives & CJEU decisions | On request |
| Germany | Federal statutes & codes | Available |
| UK | Primary legislation & case law | On request |
| China | National laws & State Council regulations | On request |
| Others | Any jurisdiction with publicly available law | On request |