GM builders, quick experiment: I fed a small ONNX model through JSTprove and let @inference_labs work its magic on my ARM phone. The model quantized down to a sub-100MB circuit, Expander produced a proof in seconds, and the output came with a cryptographic receipt that reveals nothing sensitive but proves the reasoning chain. Felt like giving AI a receipt for every decision.
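Rough sketch of the loop for anyone curious. The onnxruntime part is standard; the compile/prove/verify calls are hypothetical placeholders, since I'm going from memory and the actual JSTprove API may differ.

```python
# Sketch of the flow: run the ONNX model normally, then hand the same graph +
# inputs to a prover and get back a proof that can be checked without
# re-running the model.
# NOTE: the proving-step names below are hypothetical placeholders, not the real API.
import numpy as np
import onnxruntime as ort

# 1. Plain inference with onnxruntime (this part is real, standard API)
sess = ort.InferenceSession("model.onnx")
input_name = sess.get_inputs()[0].name
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
(output,) = sess.run(None, {input_name: x})

# 2. Proving step -- pseudocode for whatever the toolkit actually exposes:
#    circuit = compile_onnx_to_circuit("model.onnx")  # quantize + arithmetize
#    proof   = prove(circuit, x)                       # e.g. Expander backend
#    assert verify(circuit.verification_key, proof, output)  # the "receipt"
print("inference output:", output.shape)
```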
This is real edge-first zkML: 160M+ zk proofs in production, selective gates to keep costs sane, and latency that actually works for wearables or financial agents. If you care about deployable trust rather than hype, start designing for verifiable inference, not just bigger models. #zkML #AI #privacy $DGMA
Who else has tried JSTprove on mobile?
