Circuit Definition for Diverse AI Tasks
The functionality of ZK wrappers hinges on the ability to define arithmetic circuits for diverse AI tasks, such as neural network inference, regression, and data preprocessing. These circuits translate computational tasks into a set of mathematical constraints that can be verified using ZKPs. Several approaches are employed to create these circuits:
Modular Circuits
These are reusable components for common operations such as matrix multiplication, a fundamental computation in AI workloads. For two n × n matrices multiplied with the schoolbook algorithm, the number of multiplication gates is G = n³, where G is the gate count and n is the matrix dimension. This cubic scaling drives proof generation time, as larger matrices require more constraints to verify. Substrate's off-chain worker infrastructure provides an efficient environment for generating proofs for these computationally intensive circuits without affecting on-chain performance.
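The cubic scaling can be made concrete with a short sketch. The function below is illustrative only (it does not belong to any real proving framework): it counts one multiplication gate per product term in schoolbook matrix multiplication and confirms the closed form G = n³.

```python
# Hypothetical sketch: count multiplication gates for naive (schoolbook)
# n x n matrix multiplication, matching the G = n^3 scaling above.
# The function name is illustrative, not from any real ZK library.

def matmul_mul_gates(n: int) -> int:
    """Each output entry C[i][j] = sum_k A[i][k] * B[k][j] needs n
    multiplications; there are n * n entries, hence n^3 in total."""
    gates = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                gates += 1  # one multiplication gate per A[i][k] * B[k][j]
    return gates

# The explicit loop agrees with the closed form G = n^3:
for n in (2, 4, 8):
    assert matmul_mul_gates(n) == n ** 3
```

Doubling n thus multiplies the multiplication-gate count by eight, which is why off-chain proof generation matters for larger matrices.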

Pre-Built Templates
Optimized circuits are created for specific models, such as Multi-Layer Perceptrons (MLPs), to minimize the number of constraints. For an MLP with l layers of m neurons each, the gate count is approximately G ≈ l × m², since each fully connected layer contributes an m × m weight-matrix multiplication. These templates reduce the computational overhead of proof generation by providing pre-optimized circuits for common AI architectures. They are designed to integrate seamlessly with both EVM contracts and native Substrate pallets, giving developers maximum flexibility.
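The l × m² estimate can be sketched the same way. The helper below is an assumed illustration (not part of any template library): it counts the multiplications in l dense matrix-vector products of width m, which is where the per-layer m² term comes from.

```python
# Hypothetical sketch: estimate multiplication gates for an MLP with
# `layers` fully connected layers of `neurons` units each, matching
# the G ≈ l * m^2 approximation above. Illustrative names only.

def mlp_mul_gates(layers: int, neurons: int) -> int:
    """Each layer multiplies an m x m weight matrix by an m-vector:
    m multiplications per output neuron, m output neurons, so m^2
    multiplications per layer, times l layers."""
    per_layer = neurons * neurons  # dense weight matrix-vector product
    return layers * per_layer

assert mlp_mul_gates(3, 128) == 3 * 128 ** 2  # 49,152 multiplication gates
```

The estimate ignores biases and activation constraints, which grow only linearly in m and are dominated by the m² weight multiplications.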

Dynamic Generation
Tools like ZoKrates and Circom compile high-level circuit descriptions into constraint systems such as R1CS, adapting to custom tasks. The constraint count c reflects the circuit's complexity and directly affects proof generation cost; optimizations like constraint folding reduce c by merging redundant operations, improving efficiency. These tools can be integrated with Substrate's development environment, allowing developers to generate circuits that leverage the framework's unified runtime capabilities.
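One way constraint folding can work is deduplication of identical products, similar to common-subexpression elimination. The sketch below is an assumption about how such a pass might look; the triple-based representation is illustrative and is not the actual internal IR of ZoKrates or Circom.

```python
# Hypothetical sketch of "constraint folding": merge redundant
# multiplication constraints by deduplicating identical (left, right)
# operand pairs, so each product is constrained only once.

def fold_constraints(constraints):
    """constraints: list of (left_wire, right_wire, out_wire) triples.
    The first occurrence of each product is kept; later duplicates are
    dropped and their output wires rewired to the canonical one."""
    seen = {}       # (left, right) -> canonical output wire
    folded = []     # surviving constraints
    rewires = {}    # duplicate out_wire -> canonical out_wire
    for left, right, out in constraints:
        key = (left, right)
        if key in seen:
            rewires[out] = seen[key]   # duplicate gate eliminated
        else:
            seen[key] = out
            folded.append((left, right, out))
    return folded, rewires

# Two gates compute the same product a*b; folding keeps only one.
cs = [("a", "b", "t1"), ("a", "b", "t2"), ("t1", "c", "t3")]
folded, rewires = fold_constraints(cs)
assert len(folded) == 2 and rewires == {"t2": "t1"}
```

Here c drops from 3 to 2; on real circuits the savings come from repeated subexpressions such as shared weights or reused intermediate activations.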
For example, a circuit for a convolutional neural network (CNN) might include modular components for convolution operations, pre-built templates for activation functions (e.g., ReLU), and dynamically generated constraints for custom layers. This modular approach allows ZK wrappers to support a wide range of AI tasks while optimizing performance through Substrate's efficient execution environment.
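The CNN composition above can be sketched as a simple additive cost model. Everything in this block is an assumption for illustration: the function names, the per-tap convolution count, and the fixed per-ReLU constraint cost are not taken from any specific proving toolchain.

```python
# Hypothetical sketch: compose the three circuit sources for a small CNN —
# a modular convolution component, a ReLU template, and dynamically
# generated constraints for a custom layer. Illustrative cost model only.

def conv_mul_gates(h_out, w_out, c_in, c_out, k):
    # one multiplication per kernel tap per output activation
    return h_out * w_out * c_out * k * k * c_in

def relu_gates(n_activations, gates_per_relu=4):
    # a ReLU comparison expands to a handful of constraints;
    # 4 per activation is an assumed cost, not a fixed standard
    return n_activations * gates_per_relu

def cnn_total_gates(conv_specs, relu_counts, custom_gates):
    total = sum(conv_mul_gates(*spec) for spec in conv_specs)    # modular
    total += sum(relu_gates(n) for n in relu_counts)             # template
    total += custom_gates                  # dynamically generated layer
    return total

# 28x28 grayscale input, one 3x3 conv with 8 filters (26x26 output),
# ReLU over every activation, plus an assumed custom head of 1,000 gates.
total = cnn_total_gates(
    conv_specs=[(26, 26, 1, 8, 3)],
    relu_counts=[26 * 26 * 8],
    custom_gates=1000,
)
```

Budgeting gates per component this way lets a developer see which part of the model dominates proving cost before generating the full circuit.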

