Documentation Index
Fetch the complete documentation index at: https://docs.supermodel.network/llms.txt
Use this file to discover all available pages before exploring further.
Hello World: Calculator App
This example walks through creating a simple calculator app using SuperModel’s zero-inference architecture. You’ll see how routing and UI generation work via MCP sampling.

User Request
Input: “Show me a calculator”

This simple request triggers SuperModel’s layered architecture, which routes the request and generates an interactive calculator interface.

Step 1: Request Routing via Sampling
SuperModel’s gateway doesn’t decide which tool to use; it asks the client’s LLM.

Zero Inference Cost: The gateway server made no LLM API calls. The client’s LLM handled the routing decision.
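Concretely, the gateway can issue an MCP `sampling/createMessage` request (the method name comes from the MCP specification) and let the client’s model choose the tool. A minimal sketch in Python; the prompt wording, tool names, and parameter shape are illustrative assumptions, not SuperModel’s actual API:

```python
import json

def build_routing_request(user_message: str, tools: list[str]) -> dict:
    """Build an MCP sampling/createMessage request asking the client's
    LLM to pick a tool -- the server itself runs no model."""
    prompt = (
        f"Available tools: {', '.join(tools)}.\n"
        f"User request: {user_message!r}\n"
        'Reply with JSON: {"tool": <name>, "params": <object>}'
    )
    return {
        "method": "sampling/createMessage",
        "params": {
            "messages": [
                {"role": "user", "content": {"type": "text", "text": prompt}}
            ],
            "maxTokens": 100,
        },
    }

request = build_routing_request("Show me a calculator",
                                ["calculator-ui", "chart-ui"])

# A client's LLM might answer with:
routing_response = '{"tool": "calculator-ui", "params": {"theme": "light"}}'
decision = json.loads(routing_response)
print(decision["tool"])  # calculator-ui
```

The server only builds and forwards the request; parsing the reply is all the "routing logic" it needs.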
Step 2: Tool Selection & Execution
Based on the routing response, SuperModel selects the calculator-ui tool and executes it with the provided parameters.
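With the routing decision in hand, tool selection reduces to a table lookup; no server-side model call is needed. The registry and handler below are hypothetical stand-ins for illustration:

```python
# Hypothetical tool registry: maps routing decisions to handlers.
def calculator_ui(params: dict) -> str:
    # Placeholder handler; the real tool would go on to generate UI.
    return f"<calculator theme={params.get('theme', 'light')}>"

TOOLS = {"calculator-ui": calculator_ui}

def execute(decision: dict) -> str:
    """Dispatch the client-LLM's routing decision to a registered tool."""
    handler = TOOLS.get(decision["tool"])
    if handler is None:
        raise ValueError(f"unknown tool: {decision['tool']}")
    return handler(decision.get("params", {}))

result = execute({"tool": "calculator-ui", "params": {"theme": "light"}})
```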
Step 3: UI Generation via Sampling
The calculator tool uses MCP sampling to generate the actual UI component.

Zero Inference Cost: Again, the calculator tool made no LLM API calls. The client’s LLM generated the entire UI component.
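The second sampling round has the same shape as the first: the tool asks the client’s LLM to write the component and simply relays the result. The prompt text below is an assumption for illustration:

```python
def build_ui_request(spec: str) -> dict:
    """Second sampling/createMessage round: ask the client's LLM to
    author the UI component itself. Prompt wording is illustrative."""
    return {
        "method": "sampling/createMessage",
        "params": {
            "messages": [{
                "role": "user",
                "content": {
                    "type": "text",
                    "text": (f"Generate a React component for: {spec}. "
                             "Wire up AG-UI events for user input."),
                },
            }],
            "maxTokens": 2000,
        },
    }

ui_request = build_ui_request("a simple calculator")
# The client's LLM replies with the component source; the tool forwards
# it unchanged -- it never interprets or post-processes the code itself.
```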
Step 4: Resource Packaging
The calculator tool packages the generated component as an MCP-UI resource.

Step 5: Final Response
SuperModel returns the complete response to the client.

What Just Happened?
Zero Server Inference
The server made zero LLM API calls. All reasoning (routing + generation) happened on the client via MCP sampling.
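Putting steps 4 and 5 together, the tool result the client finally receives carries the generated component as an embedded resource. The `ui://` URI and the wrapping below follow the MCP-UI convention but should be read as a sketch, not SuperModel’s exact payload:

```python
def package_ui_resource(component_source: str) -> dict:
    """Wrap generated UI code as an embedded resource, MCP-UI style."""
    return {
        "type": "resource",
        "resource": {
            "uri": "ui://calculator/1",   # illustrative URI
            "mimeType": "text/html",
            "text": component_source,
        },
    }

# The final tool response: plain data, produced without any server-side
# inference -- the component text came from the client's own LLM.
final_response = {
    "content": [package_ui_resource("<div id='calc'>...</div>")],
}
```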
Intelligent Routing
The client’s LLM correctly identified that “Show me a calculator” should route to the calculator-ui tool.
Dynamic UI Generation
The client’s LLM generated a complete, functional React component with proper AG-UI event handling.
AG-UI Event Handling
The generated calculator includes proper AG-UI event handling:

- User Input Events
- Tool Call Events
- Calculation Events
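The three event families above can be modeled with a small dispatcher. The event type strings and payload fields here are placeholders for illustration, not the official AG-UI type names:

```python
handlers = {}

def on(event_type):
    """Register a handler for a given (hypothetical) AG-UI event type."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("user_input")       # digit / operator button presses
def handle_input(event):
    return f"pressed {event['key']}"

@on("tool_call")        # the component invoking a backend tool
def handle_tool(event):
    return f"calling {event['name']}"

@on("calculation")      # a completed calculation result
def handle_calc(event):
    return f"result {event['value']}"

def dispatch(event):
    """Route an incoming event to its registered handler."""
    return handlers[event["type"]](event)

dispatch({"type": "calculation", "value": 42})
```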
Cost Analysis
Traditional Approach:
- Routing decision: $0.02
- UI generation: $0.08
- Total: $0.10 per calculator request

SuperModel Approach:
- Server costs: $0.00
- Client handles all LLM work
- Total: $0.00 per calculator request
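The comparison above is simple per-request arithmetic, which the following sketch makes explicit (the request volume is an arbitrary example):

```python
ROUTING_COST = 0.02     # per-request server-side routing (traditional)
GENERATION_COST = 0.08  # per-request server-side UI generation (traditional)

def server_cost(requests: int, zero_inference: bool) -> float:
    """Total server-side LLM spend for a given request volume."""
    per_request = 0.0 if zero_inference else ROUTING_COST + GENERATION_COST
    return requests * per_request

# At 10,000 requests, $0.10/request vs $0.00/request:
traditional = server_cost(10_000, zero_inference=False)
supermodel = server_cost(10_000, zero_inference=True)
```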
Try It Yourself
Want to implement this example? Here’s the complete setup:

Quick Start Guide
Follow our step-by-step guide to set up SuperModel and create this calculator.
Next Examples
Multi-App Workflow
See how context flows between multiple specialized UI apps.
Context Handoff
Learn how apps share context for seamless user experiences.