AI systems are making decisions and taking actions. Sooner or later, someone asks: show me the proof.
Example proof packs generated by the real Assay toolchain. Locally verifiable. No account, no platform access, no trust in the vendor required.
That question — can you prove it? — comes from customers, auditors, regulators, compliance teams, security reviewers, and internal leadership.
Most teams answer with logs on their own servers, screenshots, dashboards, policy documents, and selectively presented evidence. All of it depends on trusting the vendor.
The core problem: there is no artifact a company can hand over that an outsider can independently verify. A company says "our AI controls ran" but cannot produce proof that someone else can check.
Assay is an evidence compiler. It records the important execution events, checks, and decisions during an AI workflow and packages them into a proof pack — a small, portable, cryptographically signed folder that anyone can verify offline.
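The packaging step can be sketched in a few lines of Python. This is an illustrative sketch only, not Assay's actual format or API: the event shape, manifest layout, and the HMAC demo key are all assumptions (the real toolchain presumably uses public-key signatures).

```python
import hashlib
import hmac
import json

def build_proof_pack(events: list, signing_key: bytes) -> dict:
    """Package recorded workflow events into a signed 'proof pack':
    each file gets a SHA-256 fingerprint, and the manifest of
    fingerprints is itself signed so later edits are detectable."""
    # The recorded execution events become a file in the pack.
    files = {"events.json": json.dumps(events, sort_keys=True).encode()}
    # The manifest maps each file name to its cryptographic fingerprint.
    manifest = {name: hashlib.sha256(data).hexdigest()
                for name, data in files.items()}
    # Sign the serialized manifest (HMAC here as a stand-in for a
    # real digital signature scheme).
    manifest_bytes = json.dumps(manifest, sort_keys=True).encode()
    signature = hmac.new(signing_key, manifest_bytes, "sha256").hexdigest()
    return {"files": files, "manifest": manifest, "signature": signature}

pack = build_proof_pack(
    [{"step": "policy_check", "result": "pass"}],
    signing_key=b"demo-key",
)
```

The point of the design is that the pack is self-describing: a verifier needs only the folder and the signer's key material, not access to the vendor's systems.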
Verify a real sample artifact in your browser. No install. Nothing uploaded.
Client-side verification. Reviewer packets and proof packs are checked against the same contract, just at different layers.
A proof pack is a small signed folder created from an AI workflow execution. It is the thing a buyer, auditor, or reviewer can actually hold, inspect, forward, and verify. Inside, a manifest lists a cryptographic fingerprint (hash) of every file, and the manifest itself is signed.
If a file was changed, its fingerprint no longer matches the manifest. If someone edits the manifest to cover the change, the signature breaks. That is how tampering is detected.
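The two-layer check described above can be sketched as follows. Again, this is a simplified illustration, not the real verifier: SHA-256 fingerprints and an HMAC stand in for whatever hash and public-key signature scheme the actual toolchain uses.

```python
import hashlib
import hmac
import json

def verify_pack(files: dict, manifest: dict, signature: str, key: bytes) -> bool:
    """Return True only if the manifest signature is valid AND every
    file's SHA-256 fingerprint matches its manifest entry."""
    manifest_bytes = json.dumps(manifest, sort_keys=True).encode()
    expected_sig = hmac.new(key, manifest_bytes, "sha256").hexdigest()
    if not hmac.compare_digest(expected_sig, signature):
        return False  # manifest was edited: the signature breaks
    # Recompute each file's fingerprint and compare to the manifest.
    return all(hashlib.sha256(files[name]).hexdigest() == digest
               for name, digest in manifest.items())

key = b"demo-key"
files = {"events.json": b'{"step": "policy_check"}'}
manifest = {"events.json": hashlib.sha256(files["events.json"]).hexdigest()}
sig = hmac.new(key, json.dumps(manifest, sort_keys=True).encode(),
               "sha256").hexdigest()

ok = verify_pack(files, manifest, sig, key)    # untampered pack: True
files["events.json"] = b'{"step": "skipped"}'  # tamper with a file
bad = verify_pack(files, manifest, sig, key)   # fingerprint mismatch: False
```

Note that verification here needs no network access and no vendor account: everything required to run the check travels with the pack.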
Six public artifacts from the real toolchain: three proof-pack verdicts, one reviewer packet, one insurance vertical mapping, and one MCP proxy scenario.
pip install assay-ai
git clone https://github.com/Haserjian/assay-proof-gallery