Assay Proof Gallery

Example proof packs generated by the real Assay toolchain. Locally verifiable. No account, no platform access, no trust in the vendor required.

AI systems are making decisions and taking actions. Sooner or later, someone asks: show me the proof. That question — can you prove it? — comes from customers, auditors, regulators, compliance teams, security reviewers, and internal leadership.

Most teams answer with logs on their own servers, screenshots, dashboards, policy documents, and selectively presented evidence. All of it depends on trusting the vendor.

The core problem: there is no artifact a company can hand over that an outsider can independently verify. A company says "our AI controls ran" but cannot produce proof that someone else can check.

Assay is an evidence compiler. It records the important execution events, checks, and decisions during an AI workflow and packages them into a proof pack — a small, portable, cryptographically signed folder that anyone can verify offline.

Assay exists to bridge the gap between what a company says happened and what an outsider can independently verify.

Assay doesn't make fraud impossible. It makes post-hoc tampering, silent weakening, and selective evidence presentation much harder to get away with.

Assay proves the evidence artifact has not been quietly changed after the fact. It does not, by itself, prove every upstream component was honest. Stronger deployment patterns — CI-held signing keys, transparency logs, external timestamping — raise the cost of full fabrication further.

Verify a real sample proof pack in your browser. No install. Nothing uploaded.

Client-side browser verification covers signed proof packs only. Reviewer packets remain CLI-only via assay reviewer verify.

A proof pack is a small signed folder created from an AI workflow execution. It is the thing a buyer, auditor, or reviewer can actually hold, inspect, forward, and verify.

receipt_pack.jsonl: The evidence trail. Line-by-line receipts of what happened during the run: model calls, checks, verdicts, capability uses.
pack_manifest.json: The inventory sheet. Lists every file and its fingerprint, plus the pack identity and signer information.
pack_signature.sig: The digital wax seal. Proves the manifest was signed by the pack creator and not changed afterward.
verify_report.json: The machine-readable verdict. Structured results for software, automation, and CI pipelines.
verify_transcript.md: The human-readable summary. The same verification story, written for people. Forwardable to auditors as-is.
1. Every file gets a fingerprint (a hash). Change one byte and the fingerprint changes.
2. The manifest records the expected fingerprints for every file in the pack.
3. The manifest is digitally signed with the signer's private key.
4. The verifier recomputes fingerprints and checks the signature. If anything changed, verification fails.

If a file was changed, the fingerprint won't match. If someone edits the manifest to cover it, the signature breaks. That is how tampering is detected.
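The four steps above can be sketched with Python's standard library. This is an illustrative stand-in, not Assay's actual code: the file contents, manifest layout, and key are invented, and HMAC is used in place of whatever public-key signature scheme the real toolchain applies.

```python
import hashlib
import hmac
import json

def fingerprint(data: bytes) -> str:
    """Step 1: every file gets a SHA-256 fingerprint."""
    return hashlib.sha256(data).hexdigest()

files = {"receipt_pack.jsonl": b'{"event": "model_call"}\n'}

# Step 2: the manifest records the expected fingerprint of every file.
manifest = json.dumps(
    {name: fingerprint(data) for name, data in files.items()},
    sort_keys=True,
).encode()

# Step 3: the manifest is signed. (HMAC stands in for a real
# public-key signature; the key here is a placeholder.)
key = b"signer-private-key"
signature = hmac.new(key, manifest, hashlib.sha256).digest()

def verify(files, manifest_bytes, signature, key) -> bool:
    """Step 4: recompute fingerprints and check the signature."""
    expected_sig = hmac.new(key, manifest_bytes, hashlib.sha256).digest()
    if not hmac.compare_digest(expected_sig, signature):
        return False  # manifest was edited after signing
    expected = json.loads(manifest_bytes)
    return all(
        fingerprint(data) == expected.get(name)
        for name, data in files.items()
    )

print(verify(files, manifest, signature, key))   # True: intact pack
files["receipt_pack.jsonl"] = b'{"event": "edited"}\n'
print(verify(files, manifest, signature, key))   # False: tamper detected
```

Editing a file breaks the fingerprint check; editing the manifest to match breaks the signature check. Either way, verification fails.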

PASS: Authentic evidence, declared standards pass. The pack is intact, the signature is valid, and all claimed checks succeeded.
HONEST FAIL: Authentic evidence, declared standards fail. The pack is intact, but the system violated its own standard. The failure is sealed — it cannot be rewritten after signing. A signed failure is stronger evidence than a vague pass.
TAMPERED: Evidence changed after signing. One or more files were modified. The pack cannot be trusted as authentic evidence.
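The three verdicts reduce to two independent questions: is the pack intact, and did the declared checks pass? A minimal decision sketch (function and argument names are hypothetical, not the Assay API):

```python
def verdict(integrity_ok: bool, claims_ok: bool) -> str:
    # Integrity comes first: if the signature or any fingerprint
    # is broken, the claims inside can't be trusted either way.
    if not integrity_ok:
        return "TAMPERED"
    return "PASS" if claims_ok else "HONEST FAIL"

print(verdict(True, True))    # PASS
print(verdict(True, False))   # HONEST FAIL
print(verdict(False, False))  # TAMPERED
```

Note that HONEST FAIL is only reachable when integrity holds: a sealed failure is a verifiable statement, not a defect in the evidence.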

Seven public artifacts from the real toolchain: three proof-pack verdicts, one reviewer packet, one insurance vertical mapping, one MCP proxy scenario, and one customer-data-boundary tamper diagnostic.

FintechCo Loan Approval
PASS
Five receipts across model calls, a guardian verdict, and a capability use. Integrity intact. All declared claims pass. This is what a clean, verifiable AI run looks like.
Details
assay verify-pack ./gallery/01-fintech-pass/proof_pack
Integrity: PASS
Claims: PASS
Receipts: 5
Exit: 0
6e6b34e48dd13ae0… — signature_verified
InsuranceTech Claims Review
HONEST FAIL
The evidence is authentic and untampered. It proves the system missed its declared coverage standard: 3 receipts recorded, 10 required. The failure was sealed at runtime. Failure is provable, not suppressible.
Details
assay verify-pack ./gallery/02-insurance-honest-fail/proof_pack --require-claim-pass
Integrity: PASS
Claims: FAIL
Receipts: 3 of 10
Exit: 1
d2dfa04aed0697cf… — signature_verified
DataCo Analytics — Tamper Detection
TAMPERED
A clean proof pack with one byte changed in the receipt file. The manifest and signature are untouched — the SHA-256 mismatch is all the verifier needs to detect the tamper. Anyone can detect altered evidence offline.
Details
assay verify-pack ./gallery/03-tamper-demo/good      # exit 0
assay verify-pack ./gallery/03-tamper-demo/tampered  # exit 2
Clean: PASS
Tampered: TAMPERED
Exit: 2
clean: d664115a7aaaedfb… — signature_verified
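The one-byte flip this card describes is easy to reproduce with stdlib hashing. The bytes below are invented, not the gallery's actual receipt content; the point is that SHA-256 output diverges completely on any single-byte change:

```python
import hashlib

clean = b'{"receipt": "model_call", "tokens": 512}\n'
tampered = clean.replace(b"512", b"612")  # exactly one byte changed

h_clean = hashlib.sha256(clean).hexdigest()
h_tampered = hashlib.sha256(tampered).hexdigest()

# The two digests share no meaningful structure: a verifier needs
# only the mismatch against the manifest to flag the tamper.
print(h_clean == h_tampered)  # False
```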
AcmeSaaS Support Workflow Reviewer Packet
VERIFIED WITH GAPS
A buyer-facing reviewer packet wrapped around a signed proof pack. The nested proof verifies cleanly, but packet settlement remains a separate review layer verified via CLI. This is the handoff artifact a buyer can forward and challenge.
Details
assay reviewer verify ./gallery/05-reviewer-packet-gaps/reviewer_packet
assay verify-pack ./gallery/05-reviewer-packet-gaps/reviewer_packet/proof_pack
Packet: VERIFIED_WITH_GAPS
Nested proof: PASS
Coverage: 1 evidenced / 1 partial
Packet warns that the packet manifest is unsigned; the nested proof pack remains signed and verifiable.
Insurance vertical mapping
NAIC AISET Questionnaire Packet
VERIFIED WITH GAPS
14 AISET-aligned questions mapped to proof-pack evidence: 8 are machine-provable, 1 is partially evidenced, 3 require human attestation, and 2 are explicitly out of scope. The first verifiable evidence format mapped to the insurance AI questionnaire shape.
Details
assay vendorq export-reviewer \
  --proof-pack ./proof_pack \
  --boundary ./boundary.json \
  --mapping src/assay/mappings/naic_aiset/question_mapping.json \
  --out ./naic_aiset_packet
Evidenced: 8
Partial: 1
Human-attested: 3
Out of scope: 2
Agent / tool-call scenario
LogisticsCo MCP Notary Proxy
PASS
An AI agent calls three MCP tools through the Assay MCP Notary Proxy. Every tool call is receipted with arguments, results, timing, and server identity.
Details
assay verify-pack ./gallery/04-mcp-notary-proxy/proof_pack
MCP tools: weather, inventory, risk
Exit: 0
Boundary / tamper diagnostic
Customer-Data Boundary Crash Test
DIAGNOSTIC
An authentic customer-data-boundary packet passes; a tampered packet fails. v1 says the bundle is broken. v2 says where the evidence broke: source_index plus expected/actual hashes. Bounded claim only: this observed evidence trail stayed inside this declared boundary, and later mutation is detectable.
Details
bash gallery/customer-data-boundary-crash-test/verify.sh authentic  # exit 0
bash gallery/customer-data-boundary-crash-test/verify.sh tampered   # exit 1
Authentic: PASS
Tampered: FAIL
Diagnostic: source_index + sha256
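The v1-versus-v2 distinction above (the bundle is broken versus where it broke) can be sketched as a diagnostic pass over hashed entries. All names and the data are hypothetical; only the output shape (source_index plus expected/actual hashes) mirrors the card's description:

```python
import hashlib

def diagnose(entries, manifest_hashes):
    """v2-style diagnostics: report each entry whose recomputed
    hash disagrees with the manifest, instead of a bare failure."""
    problems = []
    for i, (data, expected) in enumerate(zip(entries, manifest_hashes)):
        actual = hashlib.sha256(data).hexdigest()
        if actual != expected:
            problems.append(
                {"source_index": i, "expected": expected, "actual": actual}
            )
    return problems

clean = [b"receipt-0", b"receipt-1", b"receipt-2"]
hashes = [hashlib.sha256(d).hexdigest() for d in clean]

tampered = list(clean)
tampered[1] = b"receipt-X"  # mutate one entry

print(diagnose(clean, hashes))     # []  (authentic: no findings)
print(diagnose(tampered, hashes))  # one finding at source_index 1
```

An empty findings list corresponds to the authentic packet passing; a non-empty list pinpoints exactly which entry of the evidence trail mutated.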
Install: pip install assay-ai
Clone gallery: git clone https://github.com/Haserjian/assay-proof-gallery
Verify online: Browser verifier — client-side, nothing uploaded
Trust boundary: Assay proves the evidence artifact has not been quietly changed after the fact, not that every upstream component was honest. The trust boundary is tamper-evidence, not omniscience.