The Last Mile from AI to Human
AI-to-UI Intermediate Representation (IR) Protocol Layer
```
Human Intent → AI Understanding → AI Generation → ??? → Human Perception
                                                   ↑
                                            The gap is here
```
AI can understand human intent, reason, call tools, and generate content. But how does AI output actually reach the human? This "last mile" has been broken.
UIX bridges this gap — a protocol that both AI and UI understand.
UIX is an Intermediate Representation (IR) protocol layer between AI and UI.
```
┌─────────────────────────────────────────────────────────────┐
│                            UIX                              │
│              "The Last Mile from AI to Human"               │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│   Human Intent                                              │
│       ↓                                                     │
│   AI Processing (thinking, tool calls, generation)          │
│       ↓                                                     │
│   UIX IR        ← Standardized format AI outputs            │
│       ↓                                                     │
│   UI Rendering  ← Components that understand IR             │
│       ↓                                                     │
│   Human Perception                                          │
│                                                             │
└─────────────────────────────────────────────────────────────┘
```
UIX IR is consumed by both AI and UI.
- For AI: A standardized output format — AI generates UIX IR directly
- For UI: A standardized input format — UI renders UIX IR directly
- Result: AI speaks this format, UI understands this format — no translation needed
| Protocol | Status | Notes |
|---|---|---|
| A2UI (Google) | v0.8+ | Declarative UI payload, multi-platform (React/Flutter/SwiftUI) |
| AG-UI (CopilotKit) | v0.1+ | Event-based agent-to-frontend protocol, adopted by Google/Microsoft/AWS |
| MCP Apps (Anthropic) | SEP-1865 Draft | Still in design, not usable |
Multiple protocols are emerging, but none provides a unified IR layer that bridges them all.
UIX provides:
- UIX IR - A stable internal protocol that works today
- Adapters - Compatibility with Vercel AI SDK, AG-UI, A2UI, and future protocols
- Reference implementation - React renderer as default
```
AI Agent Events (Vercel AI SDK / AG-UI / AgentX / ...)
        ↓
UIX IR (stable internal protocol)
        ↓
        ├── ReactRenderer (works today)
        ├── A2UIRenderer (when A2UI matures)
        └── MCPAppsRenderer (when MCP Apps matures)
```
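The fan-in above can be sketched as a single generic contract: every adapter normalizes its protocol's events into the same IR. The `UIXAdapter` interface and the toy `DemoEvent` type below are illustrative assumptions, not the published `@uix-ai` API.

```typescript
// Sketch only: UIXAdapter and DemoEvent are assumptions for illustration,
// not the published @uix-ai adapter API.
type LucidBlock = {
  id: string
  type: 'text'
  status: 'completed'
  content: { text: string }
}

type LucidConversation = {
  id: string
  role: 'user' | 'assistant'
  status: 'completed'
  blocks: LucidBlock[]
  timestamp: number
}

// Every adapter maps protocol-specific events onto the same IR.
interface UIXAdapter<E> {
  toIR(events: E[]): LucidConversation[]
}

// A toy event stream standing in for Vercel AI SDK / AG-UI / AgentX events.
type DemoEvent = { conversationId: string; text: string }

const demoAdapter: UIXAdapter<DemoEvent> = {
  toIR(events) {
    return events.map((e, i) => ({
      id: e.conversationId,
      role: 'assistant',
      status: 'completed',
      blocks: [
        { id: `b${i}`, type: 'text', status: 'completed', content: { text: e.text } },
      ],
      timestamp: Date.now(),
    }))
  },
}
```

Because renderers only ever see `LucidConversation[]`, a new protocol costs one adapter, not a rewrite of the UI layer.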
"Design Tokens let developers stop redefining colors. UIX IR lets AI stop relearning how to describe UI structure."
UIX IR = Script (what to perform)
React Components = Actors (how to perform)
Design Tokens = Costumes & Props (what to wear)
| Dimension | Design Tokens | UIX IR |
|---|---|---|
| Problem Solved | Design-to-code consistency | AI-output-to-UI standardization |
| Consumer | Human developers (who understand CSS) | AI (needs structured, semantic descriptions) |
| Target Market | Design systems, component libraries | AI Agent platforms |
| Competitors | Style Dictionary, Theo | AG-UI, A2UI (protocol layer, not unified IR) |
Traditional UI pipeline:

```
Designer (Figma) → Developer writes code → User sees UI
                 ↑ Design Tokens solve this
```

AI Agent pipeline:

```
AI reasoning → UIX IR → Renderer → User sees UI
             ↑ UIX IR solves this (previously unsolved)
```
Design Tokens are style variables; UIX IR is AI's UI expression language. They operate at completely different layers, for different purposes.
```
┌─────────────────────────────────────────────────────────────┐
│ Layer 1: UIX IR (Core)                                      │
│ - JSON Schema definition                                    │
│ - Block & Conversation standards                            │
│ - AI-generatable format                                     │
└─────────────────────────────────────────────────────────────┘
                              ↓
┌─────────────────────────────────────────────────────────────┐
│ Layer 2: Renderers                                          │
│ - ReactRenderer → Web                                       │
│ - A2UIRenderer → Native (future)                            │
│ - MCPAppsRenderer → Claude Desktop (future)                 │
└─────────────────────────────────────────────────────────────┘
                              ↓
┌─────────────────────────────────────────────────────────────┐
│ Layer 3: Design System                                      │
│ - @uix-ai/tokens (design tokens)                            │
│ - @uix-ai/react (base components)                           │
│ - @uix-ai/stream (streaming renderer)                       │
└─────────────────────────────────────────────────────────────┘
```
All implementations depend on the UIX IR abstraction:
```
      ┌─────────────────────┐
      │       UIX IR        │ ← Abstract protocol
      │    (JSON Schema)    │
      └──────────┬──────────┘
                 │
     ┌─────────┬─┴─────┬─────────┐
     │         │       │         │
     ▼         ▼       ▼         ▼
┌─────────┐ ┌──────┐ ┌──────┐ ┌──────┐
│ Vercel  │ │AG-UI │ │ A2UI │ │AgentX│
│ AI SDK  │ │Adapt.│ │Adapt.│ │  UI  │
└─────────┘ └──────┘ └──────┘ └──────┘
```
```typescript
interface LucidConversation {
  id: string
  role: 'user' | 'assistant' | 'system'
  status: 'streaming' | 'completed' | 'error'
  blocks: LucidBlock[]
  timestamp: number
}

interface LucidBlock {
  id: string
  type: 'text' | 'tool' | 'thinking' | 'image' | 'file' | 'error'
  status: 'streaming' | 'completed' | 'error'
  content: unknown // varies by type
}

// Renderer interface
interface LucidRenderer<T> {
  render(conversations: LucidConversation[]): T
}
```

| Type | Description | Content |
|---|---|---|
| `text` | Text content (supports streaming) | `{ text: string }` |
| `tool` | Tool/function call result | `{ name, input, output, status }` |
| `thinking` | AI reasoning process | `{ reasoning: string }` |
| `image` | Image content | `{ url, alt, width, height }` |
| `file` | File attachment | `{ name, type, url }` |
| `error` | Error message | `{ code, message }` |
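To make the renderer contract concrete, here is a minimal `LucidRenderer<string>` that flattens conversations into a plain-text transcript instead of React nodes. The rendering logic is illustrative, not the `@uix-ai/react` implementation; the IR types are re-stated so the sketch is self-contained.

```typescript
// Re-stating the IR types from the spec above so this sketch is self-contained.
interface LucidBlock {
  id: string
  type: 'text' | 'tool' | 'thinking' | 'image' | 'file' | 'error'
  status: 'streaming' | 'completed' | 'error'
  content: unknown // varies by type
}

interface LucidConversation {
  id: string
  role: 'user' | 'assistant' | 'system'
  status: 'streaming' | 'completed' | 'error'
  blocks: LucidBlock[]
  timestamp: number
}

interface LucidRenderer<T> {
  render(conversations: LucidConversation[]): T
}

// Illustrative renderer: targets a plain-text transcript instead of the DOM.
// Text blocks print their content; other block types print a placeholder.
const textRenderer: LucidRenderer<string> = {
  render(conversations) {
    return conversations
      .map(c =>
        c.blocks
          .map(b =>
            b.type === 'text'
              ? `${c.role}: ${(b.content as { text: string }).text}`
              : `${c.role}: [${b.type}]`,
          )
          .join('\n'),
      )
      .join('\n')
  },
}
```

Any target (React, native, plain text) is just another `LucidRenderer<T>` over the same conversations.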
UIX is abstracted from AgentX practices:
```
AgentX UI (rough implementation, experimental)
        ↓ abstract & refine
UIX IR (protocol specification)
        ↓ implement
AgentX UI + other frameworks (follow the spec)
```
```
AgentX 4-Layer Events
  │
  │ Stream:  text_delta, tool_use_start
  │ State:   conversation_thinking, tool_executing
  │ Message: assistant_message, tool_result_message
  │
  ↓ Transform
UIX IR (LucidConversation[])
  ↓ Render
React Components
```
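The Transform step above amounts to folding an event stream into IR conversations. The `AgentXEvent` payload shapes below are assumptions based on the event names shown, not AgentX's actual types; the folding logic is a sketch.

```typescript
// Assumed event payloads, modeled after the event names in the diagram above.
type AgentXEvent =
  | { kind: 'text_delta'; conversationId: string; delta: string }
  | { kind: 'assistant_message'; conversationId: string }

type Block = {
  id: string
  type: 'text'
  status: 'streaming' | 'completed'
  content: { text: string }
}

type Conversation = {
  id: string
  role: 'assistant'
  status: 'streaming' | 'completed'
  blocks: Block[]
  timestamp: number
}

// Fold a stream of AgentX events into UIX IR conversations.
function transform(events: AgentXEvent[]): Conversation[] {
  const byId = new Map<string, Conversation>()
  for (const e of events) {
    let conv = byId.get(e.conversationId)
    if (!conv) {
      conv = {
        id: e.conversationId,
        role: 'assistant',
        status: 'streaming',
        blocks: [
          { id: `${e.conversationId}-b1`, type: 'text', status: 'streaming', content: { text: '' } },
        ],
        timestamp: Date.now(),
      }
      byId.set(e.conversationId, conv)
    }
    if (e.kind === 'text_delta') {
      conv.blocks[0].content.text += e.delta // accumulate streamed text
    } else {
      conv.status = 'completed' // message finalized
      conv.blocks[0].status = 'completed'
    }
  }
  return [...byId.values()]
}
```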
| Package | Layer | Status | Description |
|---|---|---|---|
| `@uix-ai/core` | Protocol | 🚧 Designing | UIX IR JSON Schema & TypeScript types |
| `@uix-ai/tokens` | Design System | ✅ Ready | Design tokens (colors, typography, spacing) |
| `@uix-ai/react` | Renderer | ✅ Ready | React renderer & base components |
| `@uix-ai/stream` | Renderer | ✅ Ready | Streaming markdown renderer (Streamdown) |
| `@uix-ai/agent` | Components | ✅ Ready | AI Agent conversation components |
| `@uix-ai/adapter-vercel` | Adapter | ✅ Ready | Vercel AI SDK 4.x / 6.x ↔ UIX IR converter |
| `@uix-ai/adapter-agui` | Adapter | 🚧 Alpha | AG-UI protocol events → UIX IR converter |
| `@uix-ai/adapter-a2ui` | Adapter | 🧪 Experimental | Google A2UI declarative UI → UIX IR converter |
UIX provides skills (AI-readable design rules) that work with any AI coding tool. No npm install needed — just copy a Markdown file.
| Skill | For | Description |
|---|---|---|
| `lucid-ui` | Everyone | Professional design system that replaces the AI purple cliché |
| `uix-components` | UIX users | Component API, adapter patterns, IR type reference |
Supported platforms: Claude Code, Claude.ai, Cursor, OpenAI Codex, Windsurf, GitHub Copilot, Cline, Gemini Code Assist
See `skills/README.md` for installation instructions.
```bash
pnpm add @uix-ai/react
```

```tsx
import { Button } from '@uix-ai/react'

function App() {
  return <Button>Click me</Button>
}
```

Example UIX IR payload:

```json
{
  "conversations": [
    {
      "id": "conv-1",
      "role": "user",
      "status": "completed",
      "blocks": [
        { "id": "b1", "type": "text", "status": "completed", "content": { "text": "Hello" } }
      ]
    },
    {
      "id": "conv-2",
      "role": "assistant",
      "status": "streaming",
      "blocks": [
        { "id": "b2", "type": "text", "status": "streaming", "content": { "text": "Hi there..." } },
        { "id": "b3", "type": "tool", "status": "completed", "content": { "name": "search", "output": "..." } }
      ]
    }
  ]
}
```

Rational Theme - Tech Blue `#0284c7`
- For: Data analysis, Technical products, Productivity tools
- Represents: Efficiency, Precision, Computation
Sentient Theme - Wisdom Gold #f59e0b
- For: Creative tools, Human-centric products, Thinking aids
- Represents: Wisdom, Thinking, Humanity
- White Foundation - Clear visual base, no dark mode
- No AI Purple - Reject overused AI clichés
- Block-Based - Parallel rendering of text + tools
- Streaming-First - Self-healing incomplete content
- Accessibility by Default - Not an afterthought
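"Streaming-First" self-healing can take forms like the sketch below, which closes a dangling code fence in a partial markdown chunk so it still renders mid-stream. This is an illustration, not the actual `@uix-ai/stream` implementation.

```typescript
// Illustrative self-healing for streamed markdown (not the @uix-ai/stream
// implementation): if a partial chunk ends inside an unclosed code fence,
// temporarily close the fence so the fragment renders cleanly.
function healStreamingMarkdown(partial: string): string {
  const FENCE = '\x60'.repeat(3) // the ``` marker, escaped to keep this example's own fence balanced
  const fenceCount = partial.split(FENCE).length - 1
  // An odd number of fence markers means a code block is still open.
  return fenceCount % 2 === 1 ? `${partial}\n${FENCE}` : partial
}
```

The same idea extends to unclosed bold/italic markers and half-received table rows: render a repaired snapshot now, replace it when the next chunk arrives.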
- Design token system (`@uix-ai/tokens`)
- React base components (`@uix-ai/react`)
- Streaming markdown renderer (`@uix-ai/stream`)
- AI Agent components (`@uix-ai/agent`)
  - ChatMessage, ChatInput, Avatar system
  - StreamText, ThinkingIndicator, ToolResult
  - ChatList, ChatWindow layout components
- UIX IR JSON Schema (`packages/core/schema/uix-ir.schema.json`)
- TypeScript type definitions (`@uix-ai/core`)
- Vercel AI SDK adapter (`@uix-ai/adapter-vercel`, supports SDK 4.x & 6.x)
- AG-UI protocol adapter (`@uix-ai/adapter-agui`)
- A2UI protocol adapter (`@uix-ai/adapter-a2ui`, experimental)
- Documentation & examples (`examples/`, per-package READMEs)
- AgentX adapter
- IR validation tools
- npm publish (`@uix-ai/*` packages)
- `create-uix-app` CLI scaffolding
- Live demo with real AI agent
- MCP Apps renderer (when mature)
- More adapter integrations (LangChain, CrewAI, etc.)
"Protocols are multiplying — AG-UI, A2UI, MCP Apps, Vercel AI SDK — each with its own message format. UIX IR is the unified abstraction layer that adapts to all of them."
| Approach | Risk |
|---|---|
| Wait for one standard to win | Product stalls, bet on wrong horse |
| Bind to AG-UI directly | Locked into one protocol's event model |
| Bind to A2UI directly | A2UI changes, major rewrite needed |
| Bind to MCP Apps directly | Still in draft, may pivot |
| UIX IR + Adapters | Internal stability, external flexibility |
Part of the Deepractice AI development ecosystem:
| Project | Description |
|---|---|
| AgentX | AI Agent development framework |
| PromptX | Prompt engineering platform |
| PromptML | Deepractice Prompt Markup Language |
```bash
git clone https://github.com/Deepractice/UIX.git
cd UIX
pnpm install
pnpm dev
```

MIT - see LICENSE