2. The RouterLink Ecosystem: Demand · Supply · Routing
RouterLink decomposes the AI ecosystem into three explicit roles that interact through protocol rules rather than organizational trust.

Demand — Users and Developers
Users interact through a single unified gateway that abstracts away provider-specific APIs, authentication schemes, and billing systems. Users can express preferences such as:
- Cost sensitivity versus latency requirements
- Fallback behavior during outages
- Geographic or regulatory constraints
Users pay for usage in $WAI while retaining transparency into how requests are routed.
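To make the preference model concrete, here is a minimal sketch of how a user's constraints might be expressed and applied when filtering candidate providers. All field and function names are illustrative assumptions, not part of any published RouterLink API.

```python
from dataclasses import dataclass

# Hypothetical sketch: a user's routing preferences bundled into one
# payload handed to the unified gateway. Field names are illustrative.

@dataclass
class RoutingPreferences:
    max_cost_per_1k_tokens: float   # cost sensitivity (denominated in $WAI)
    max_latency_ms: int             # latency requirement
    fallback_providers: list        # ordered fallbacks during outages
    allowed_regions: list           # geographic / regulatory constraints

prefs = RoutingPreferences(
    max_cost_per_1k_tokens=0.02,
    max_latency_ms=1500,
    fallback_providers=["anthropic", "google"],
    allowed_regions=["eu-west", "eu-central"],
)

def is_eligible(provider_region: str, provider_cost: float,
                prefs: RoutingPreferences) -> bool:
    """Filter one candidate provider against the user's stated constraints."""
    return (provider_region in prefs.allowed_regions
            and provider_cost <= prefs.max_cost_per_1k_tokens)

print(is_eligible("eu-west", 0.015, prefs))   # True: in-region and under budget
print(is_eligible("us-east", 0.015, prefs))   # False: region not allowed
```

The point of the sketch is that constraints travel with the request, so the gateway can enforce them mechanically rather than per-provider.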
Supply — Service Providers
Service providers register AI capabilities with the protocol and publish pricing weights. Revenue is earned based on actual demand, measured through PoTU receipts, rather than fixed contracts.
This includes:
- Model providers (OpenAI, Anthropic, Google, etc.)
- Cloud infrastructure operators
- Specialized inference services
- Capacity holders with idle compute
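The supply-side mechanics above can be sketched as follows: a provider registers with a pricing weight, usage accrues as verified receipts (standing in for PoTU), and revenue is a pure function of those receipts. Every name here is a hypothetical illustration, not protocol code.

```python
# Hypothetical sketch of supply-side registration and usage-based revenue.
# Receipts stand in for PoTU receipts: revenue follows verified demand,
# not a fixed contract.

providers = {}

def register(provider_id: str, capabilities: list, price_weight: float):
    """Publish a capability list and a pricing weight to the registry."""
    providers[provider_id] = {
        "capabilities": capabilities,
        "price_weight": price_weight,   # price per token served
        "receipts": [],                 # verified usage records
    }

def record_receipt(provider_id: str, tokens_served: int):
    """Append one verified-usage receipt (PoTU stand-in)."""
    providers[provider_id]["receipts"].append(tokens_served)

def revenue(provider_id: str) -> float:
    """Revenue is derived entirely from recorded receipts."""
    p = providers[provider_id]
    return p["price_weight"] * sum(p["receipts"])

register("model-a", ["chat", "embeddings"], price_weight=0.00002)
record_receipt("model-a", 120_000)
record_receipt("model-a", 80_000)
print(revenue("model-a"))  # 4.0
```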
Routing — Node Operators (Router + Validator)
Node operators are responsible for:
- Forwarding AI requests to appropriate services
- Observing execution outcomes
- Participating in performance validation through PoLP
Crucially, RouterLink couples routing and validation into a single role. Nodes that influence traffic are also accountable for observing and validating results.
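One way to picture the coupled role is that the routing function and the observation log live in the same node: whichever node selects a provider also records the outcome it observed, and that record later feeds performance validation. This is a sketch under assumed names; the weighting scheme and record shape are illustrative only.

```python
import random

# Hypothetical sketch of the coupled router/validator role: the node that
# chose where traffic went is the node accountable for reporting what
# actually happened (feeding PoLP). Names and weights are illustrative.

def route(candidates: dict) -> str:
    """Weighted selection: lower-latency providers draw more traffic."""
    ids = list(candidates)
    weights = [1.0 / candidates[i]["latency_ms"] for i in ids]
    return random.choices(ids, weights=weights, k=1)[0]

observations = []

def observe(provider_id: str, ok: bool, latency_ms: int):
    """Record the execution outcome the routing node witnessed."""
    observations.append({"provider": provider_id,
                         "ok": ok,
                         "latency_ms": latency_ms})

candidates = {"fast": {"latency_ms": 200}, "slow": {"latency_ms": 800}}
chosen = route(candidates)
observe(chosen, ok=True, latency_ms=candidates[chosen]["latency_ms"])
```

Because the same node both routed and observed, there is no gap where a third party must be trusted to reconcile "where the request went" with "how it performed".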
Economic Alignment
| Role | Contribution | Reward |
| --- | --- | --- |
| Users | Pay for AI inference | Access to multi-provider routing |
| Providers | Supply AI capacity | Revenue from verified usage |
| Nodes | Route & validate | Fees + performance bonuses |
| Delegators | Stake capital | Share of node rewards |
All rewards are derived from protocol-generated proofs (PoTU/PoLP), not discretionary allocation.
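The claim that rewards are derived rather than allocated can be illustrated with a small arithmetic sketch: a node's reward is a function of its fees and validated performance score, and a delegator's share is pro rata on stake. The split fractions and commission rate below are assumptions for illustration, not protocol constants.

```python
# Hypothetical sketch: rewards computed mechanically from proof-derived
# inputs (fees, PoLP score, stake), with no discretionary step.
# Commission and bonus-pool values are illustrative assumptions.

def node_reward(routing_fees: float, polp_score: float,
                bonus_pool: float) -> float:
    """Base fees plus a performance bonus scaled by the validated PoLP score."""
    return routing_fees + bonus_pool * polp_score

def delegator_share(node_total: float, delegator_stake: float,
                    total_stake: float, commission: float = 0.1) -> float:
    """Pro-rata share of a node's rewards after the node's commission."""
    return node_total * (1 - commission) * (delegator_stake / total_stake)

total = node_reward(routing_fees=10.0, polp_score=0.8, bonus_pool=5.0)
print(total)                                                  # 14.0
print(delegator_share(total, delegator_stake=25.0,
                      total_stake=100.0))                     # 3.15
```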