
# CipherExplain — Encrypted Explainable AI

- License: AGPL-3.0-or-later (see LICENSE)
- Patents: PCT/IB2026/053378, PCT/IB2026/053405 (see NOTICE)
- Managed service: vaultbytes.com/cipherexplain


## The Problem

Regulations are colliding. GDPR and HIPAA require data to stay encrypted. The EU AI Act (Article 13) requires AI systems to explain their decisions with per-feature attributions. A credit bureau cannot decrypt applicant data to compute an explanation — but it must provide one.

CipherExplain solves this: it computes SHAP feature attributions entirely under Fully Homomorphic Encryption. The server never sees plaintext. The client gets a complete explanation of which features drove the prediction — all on encrypted data.

## What's In This Repo

| Component | What it does | Patent |
| --- | --- | --- |
| CipherExplain | Encrypted SHAP explanations — compresses 2^50 coalition evaluations to 390 via deterministic sampling, packs them into 2 CKKS ciphertexts via SIMD, and collapses Shapley regression to a single homomorphic matrix-vector multiply | PCT/IB2026/053405 |
| FHE Testing Oracle | Adversarial precision testing for FHE circuits — uses CMA-ES evolutionary search to find inputs that maximize plaintext-vs-ciphertext divergence, catching bugs that random testing misses | PCT/IB2026/053378 |
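The "single matrix-vector multiply" collapse can be sketched in plaintext. KernelSHAP fits attributions by weighted least squares over sampled coalitions; for a fixed coalition design `Z` and weights `w`, the whole solve folds into one precomputed matrix `A`, so the online step is a single `A @ v` product (the shape that maps onto one homomorphic mat-vec). The sketch below uses toy dimensions and uniform weights as stand-ins, not the patented sampling scheme:

```python
import numpy as np

d = 5                                  # toy feature count (the scheme uses d = 50)
# Deterministic coalition design: singletons, their complements, the grand coalition.
Z = np.vstack([np.eye(d), 1.0 - np.eye(d), np.ones((1, d))])   # K = 11 coalitions
w = np.ones(Z.shape[0])                # uniform stand-in for the Shapley kernel weights

# Offline, in plaintext: fold the weighted least-squares solve into one matrix A.
ZtW = Z.T * w
A = np.linalg.solve(ZtW @ Z, ZtW)      # shape (d, K); note A @ Z == I

# Online, conceptually under encryption: attributions are one mat-vec A @ v,
# where v holds the K coalition evaluations f(x_S).
phi_true = np.arange(1.0, d + 1)       # pretend ground-truth attributions
v = Z @ phi_true                       # coalition values consistent with phi_true
phi = A @ v
assert np.allclose(phi, phi_true)      # the collapsed solve recovers them exactly
```

Precomputing `A` in plaintext is what keeps the encrypted side down to a single multiply rather than an encrypted matrix inversion.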

### Why adversarial testing matters

FHE precision bugs are input-dependent. A circuit can pass 99.99% of random inputs and fail catastrophically on specific inputs that trigger noise resonance. In our benchmark, random testing (500 inputs) found zero bugs. Adversarial CMA-ES search (520 evaluations, same budget) found 268 diverging inputs with an error 3,008x larger — the difference between "looks fine" and "production failure."
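A toy illustration of the idea (not the oracle's implementation): a simple (1+1) evolution strategy stands in for CMA-ES, and the "encrypted" backend is modeled as fixed-point rounding, whose error is input-dependent. Both strategies get the same evaluation budget:

```python
import numpy as np

rng = np.random.default_rng(1)

def plaintext(x):
    """Reference circuit: exact square."""
    return x * x

def encrypted_sim(x, scale=256.0):
    """Stand-in for an FHE backend: inputs survive only at fixed-point precision."""
    q = np.round(x * scale) / scale
    return q * q

def divergence(x):
    return abs(plaintext(x) - encrypted_sim(x))

# Random testing: 500 uniform samples over the input domain.
random_worst = max(divergence(x) for x in rng.uniform(-4.0, 4.0, 500))

# Adversarial testing: a (1+1) evolution strategy (CMA-ES stand-in), same budget.
x, sigma = 0.0, 1.0
for _ in range(500):
    cand = float(np.clip(x + sigma * rng.normal(), -4.0, 4.0))
    if divergence(cand) > divergence(x):
        x, sigma = cand, sigma * 1.5      # success: accept and widen the search
    else:
        sigma = max(sigma * 0.9, 1e-3)    # failure: narrow it
adversarial_worst = divergence(x)

print(f"random worst-case divergence:      {random_worst:.2e}")
print(f"adversarial worst-case divergence: {adversarial_worst:.2e}")
```

The search climbs toward inputs where the rounding error resonates with the circuit, which is exactly the regime uniform sampling tends to miss. The real oracle drives this loop with CMA-ES (the `cma` package installed in the benchmark setup below).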


## Quick Start

### Python SDK


```bash
pip install cipherexplain              # base client
pip install 'cipherexplain[fhe]'       # + OpenFHE for local CKKS
```

```python
from cipherexplain_sdk import CipherExplainClient, extract_spec

client = CipherExplainClient(api_key="vb_...")

# Register any linear classifier (weights only — no data sent)
spec = extract_spec(model, "my_model", feature_names, scaler=scaler)
client.register(spec)

# Get SHAP explanation
result = client.explain_raw("my_model", x_raw)
print(result["shap_values"])
```

Full docs: sdk/README.md.

### CipherExplain API

```python
import requests

BASE = "https://cipherexplain.vaultbytes.com"
HDR  = {"X-API-Key": "vb_your_key"}

# Load demo model
requests.post(f"{BASE}/startup", headers=HDR)

# Explain a prediction (plaintext path)
r = requests.post(f"{BASE}/explain_raw", headers=HDR,
    json={"model_id": "credit_model", "features": [38, 13, 0, 0, 40],
          "fhe_mode": "disable"})

print(r.json())
```

### FHE Modes

| `fhe_mode` | What runs under encryption | Latency | Use case |
| --- | --- | --- | --- |
| `"disable"` | Nothing — plaintext sklearn | < 5 ms | Development, tree models |
| `"simulate"` | Quantization-correct simulation (no encryption) | ~200 ms | Staging, accuracy validation |
| `"execute"` | Real TFHE encryption via Concrete ML | ~700 ms | Production FHE (d <= 10) |
| `"ckks"` | Full CKKS pipeline — masking, logreg, regression all encrypted | ~10 s | Production FHE (d = 50, K = 390) |

### FHE Testing Oracle

```bash
docker run vaultbytes/fhe-oracle \
  --adapter-module my_circuit.py \
  --time-budget 300 \
  --divergence-threshold 1e-6
```
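The adapter module wires your circuit into the oracle. The shape below is purely illustrative (function and constant names are hypothetical; the real contract is defined by the fhe-testing-oracle docs in this repo): a plaintext reference path, a candidate encrypted/simulated path, and the divergence the search maximizes.

```python
# my_circuit.py — hypothetical adapter sketch; the oracle's real interface may differ.
import numpy as np

INPUT_DIM = 2                       # hypothetical: dimensionality of the search space
INPUT_LOW, INPUT_HIGH = -1.0, 1.0   # hypothetical: search domain bounds

def reference(x):
    """Plaintext ground truth for the circuit under test."""
    return float(x[0] * x[1] + x[0])

def candidate(x):
    """Stand-in for the encrypted path: 8-bit fixed-point quantization."""
    q = np.round(np.asarray(x, dtype=float) * 127.0) / 127.0
    return float(q[0] * q[1] + q[0])

def divergence(x):
    """Scalar the oracle maximizes: plaintext-vs-ciphertext gap."""
    return abs(reference(x) - candidate(x))
```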

GitHub Action:

```yaml
- name: CipherExplain
  uses: VaultBytes/CipherExplain@v1
  with:
    circuit-file: src/my_fhe_circuit.py
    divergence-threshold: 1e-6
    time-budget: 120
    api-key: ${{ secrets.VAULTBYTES_API_KEY }}
```

## Key Numbers

| Metric | Value |
| --- | --- |
| SHAP accuracy (MAE vs KernelSHAP) | 0.009 |
| Efficiency axiom error | 0.0 (machine epsilon) |
| Coalition compression (d=50) | 2^50 -> 390 evaluations |
| CKKS latency (d=50, M1) | ~10 s end-to-end |
| Additional CKKS depth | +2 levels |
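The zero efficiency-axiom error is easy to reproduce in plaintext for a linear model, where SHAP has the closed form phi_i = w_i * (x_i - mu_i) and the attributions must sum to f(x) - f(mu). A minimal check (illustrative random weights, not the repo's demo credit model):

```python
import numpy as np

rng = np.random.default_rng(42)
d = 50                            # matches the d = 50 benchmark setting
w = rng.normal(size=d)            # illustrative linear-model weights
b = 0.1                           # intercept
x = rng.normal(size=d)            # instance to explain
mu = rng.normal(size=d)           # background (baseline) feature means

def f(z):
    return w @ z + b

phi = w * (x - mu)                # closed-form SHAP values for a linear model
gap = abs(phi.sum() - (f(x) - f(mu)))
print(gap)                        # on the order of machine epsilon
```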

## Reproduce the Benchmarks

```bash
git clone https://github.com/VaultBytes/CipherExplain
cd CipherExplain

python -m venv .venv && source .venv/bin/activate
pip install -r cipherexplain/requirements.txt cma

# Patent specification benchmarks
cd cipherexplain && python benchmark.py

# 12-claim validation suite
cd fhe_shap_tests && python run_all_tests.py

# Adversarial testing oracle benchmark
cd .. && python oracle_benchmark.py
```

## Repository Structure

```text
cipherexplain/           CipherExplain server (FastAPI)
  api/                   Route handlers
  core/                  SHAP engine, FHE operations
  fhe_shap_tests/        12-claim patent validation suite
  middleware/            Auth, metering, rate limiting
  tests/                 API integration tests

fhe-testing-oracle/      FHE Differential Testing Oracle
  adapters/              Zama, OpenFHE, SEAL, TFHE backends
  benchmarks/            Vendor circuit benchmarks
  tests/                 Adapter tests

sdk/                     Python client SDK
examples/                Sample FHE circuits
```

## API Reference

All endpoints require the `X-API-Key` header except `/health`, `/docs`, and `/signup/*`.

### Models

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /health | Liveness check |
| GET | /models | List registered models |
| POST | /startup | Load demo credit model |
| POST | /models/register | Register model weights |
| DELETE | /models/{model_id} | Delete a registered model |

### Explain

| Method | Endpoint | Description |
| --- | --- | --- |
| POST | /explain | SHAP explanation (pre-scaled features) |
| POST | /explain_raw | SHAP explanation (auto-scaled raw features) |
| POST | /report | Generate PDF audit report |

### Account

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | /usage | Monthly quota and usage |
| POST | /keys/rotate | Rotate API key |

## Patent Notice

Portions of this software are covered by pending patent applications:

- PCT/IB2026/053378 — FHE Differential Testing Oracle
- PCT/IB2026/053405 — Homomorphic Encrypted SHAP

The AGPL-3.0 license grants rights to use, modify, and distribute the source code. It does not grant a license under any patent claims. See NOTICE for details.

- Patent licensing: b@vaultbytes.com
- Commercial (non-AGPL) licensing: b@vaultbytes.com


## EU AI Act Compliance

The EU AI Act (Article 13) requires transparent feature-level explanations for high-risk AI decisions. CipherExplain produces per-feature SHAP attributions under FHE — input data is never decrypted on the server during inference.


Copyright (C) 2026 Bader Issaei / VaultBytes Innovations Ltd
