The Provocation
The interface—the screen you see, the buttons you click, the colors you perceive—is becoming ephemeral.
It's no longer the primary output of design work. Instead, design is shifting upstream: toward infrastructure, systems, semantics, and constraints. Interfaces will be generated in real-time by AI agents based on user context, brand parameters, and design tokens. The "designless" future doesn't mean design disappears; it means design becomes invisible—embedded in the systems that birth experiences.
This is uncomfortable for designers who've spent their careers perfecting pixels. But it's inevitable.
Evidence: Generative UI Is Already Here
- Real implementations:
- Google Gemini 3: Generates completely custom interfaces in real-time based on queries, rendering full HTML/CSS/JavaScript
- Vercel v0: Now agentic; it can research, reason, debug, and take projects end-to-end
- Figma's AI UI Generator: Prompt to UI in seconds, fully editable
- Google A2UI: Open standard enabling AI agents to compose dynamic UIs from trusted component catalogs
The insight: The interface is no longer designed; it's specified and generated. This is fundamentally different work.
Design Systems as Expression Infrastructure
Design systems aren't libraries anymore. They're evolving into expression infrastructure: the machine-readable, semantic foundation that enables AI to make intelligent design decisions.
- Key signals:
- Figma's MCP (Model Context Protocol) enables AI agents to access design context, tokens, components, documentation directly
- Design tokens are restructured with semantic layers that carry intent (not just `blue-5` but `color-feedback-error`)
- Tools like Supernova.io now serve "structured data and component references directly into AI tools"
- The transformation:
- Old: Design System = Static library of components
- New: Design System = AI-consumable expression infrastructure that governs how interfaces emerge
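As a sketch of what "AI-consumable" could look like in practice, a semantic token can carry its intent and valid usage contexts alongside its resolved value, so an agent selects by meaning rather than by appearance. The field names and token names below are hypothetical illustrations, not any published token specification:

```typescript
// Hypothetical semantic token model: each token carries intent metadata
// an AI agent can read, not just a raw value.
type SemanticToken = {
  value: string;            // resolved value (e.g. a hex color)
  aliasOf?: string;         // base token this semantic token points to
  intent: string;           // what the token means, not how it looks
  usage: string[];          // contexts where an agent may apply it
};

const tokens: Record<string, SemanticToken> = {
  // base layer: appearance only
  "color-red-6": {
    value: "#b91c1c",
    intent: "base palette value",
    usage: [],
  },
  // semantic layer: intent, aliasing the base layer
  "color-feedback-error": {
    value: "#b91c1c",
    aliasOf: "color-red-6",
    intent: "signals a blocking error to the user",
    usage: ["form-validation", "system-alerts"],
  },
};

// An agent styling an error message queries by intent,
// never by raw palette name.
const errorColor = tokens["color-feedback-error"].value;
console.log(errorColor); // "#b91c1c"
```

The point of the extra metadata is governance: an agent that only sees `#b91c1c` can misuse it, while an agent that sees `intent` and `usage` can be constrained to valid contexts.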
From Designing States to Designing Rules
Before: Designers design every screen state (normal, hover, loading, error, success)
After: Designers define the rule: "On error, show red feedback at the top with a clear action to resolve." AI generates the variations.
Why It Matters: Infinite states exist, but they're all governed by a coherent rule set. This is how you scale design without scaling headcount.
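The before/after above can be made concrete. Instead of one mockup per error screen, the designer ships a single rule and a generator derives every variant from it. The rule shape and function names here are illustrative assumptions, not a real generative-UI API:

```typescript
// Illustrative rule definition: one rule governs all error states,
// rather than one hand-designed mockup per screen.
type FeedbackRule = {
  trigger: "error" | "success" | "loading";
  placement: "top" | "inline";
  tokenRef: string;          // semantic token, not a raw color
  requiresAction: boolean;   // must offer a way to resolve
};

const errorRule: FeedbackRule = {
  trigger: "error",
  placement: "top",
  tokenRef: "color-feedback-error",
  requiresAction: true,
};

// A generator applies the one rule to any concrete error context,
// producing unbounded variations governed by the same constraints.
function renderFeedback(rule: FeedbackRule, message: string, action: string): string {
  const actionPart = rule.requiresAction ? ` [${action}]` : "";
  return `(${rule.placement}, ${rule.tokenRef}) ${message}${actionPart}`;
}

console.log(renderFeedback(errorRule, "Payment failed", "Retry"));
// "(top, color-feedback-error) Payment failed [Retry]"
```

Every generated error state differs in copy and context, but all inherit the same placement, token reference, and resolve-action requirement from the one rule.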
Designing for Dual Audiences: Humans AND Agents
Interfaces are no longer designed for humans alone. They must work for two audiences: humans and AI agents.
- Evidence:
- AG-UI Protocol: Bridge that turns complex agentic operations into experiences humans understand
- Microsoft Design's Agentic UX: Design interfaces where agents can observe environment, process information, decide actions, learn over time
- Gartner prediction: By 2028, 33% of enterprise software will incorporate agentic AI
Designing for agents means designing for legibility—machine-readable information structures, clear decision paths, visible reasoning.
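One concrete reading of "legibility" is that every interactive element exposes a structured description an agent can parse: its purpose, its consequences, and its preconditions. The schema below is a hypothetical illustration of that idea, not AG-UI or A2UI syntax:

```typescript
// Hypothetical machine-readable descriptor for a UI action: the same
// button a human sees also declares its purpose and consequences to agents.
type ActionDescriptor = {
  id: string;
  label: string;              // what humans read
  purpose: string;            // what agents read
  destructive: boolean;       // lets an agent decide to defer to a human
  preconditions: string[];    // a clear, checkable decision path
};

const deleteAction: ActionDescriptor = {
  id: "account-delete",
  label: "Delete account",
  purpose: "permanently removes the user's account and data",
  destructive: true,
  preconditions: ["user-authenticated", "confirmation-given"],
};

// A cautious agent checks the legibility metadata before invoking:
// destructive actions are never taken autonomously, and all
// preconditions must be satisfied.
function agentMayInvoke(a: ActionDescriptor, satisfied: string[]): boolean {
  return !a.destructive && a.preconditions.every((p) => satisfied.includes(p));
}

console.log(agentMayInvoke(deleteAction, ["user-authenticated"])); // false
```

The visible reasoning lives in the descriptor itself: an agent (or a human auditing the agent) can see exactly why an action was or was not taken.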
What Designers Must Do NOW
1. Adopt semantic thinking: Name design decisions with intent, not appearance
   - ❌ `color-blue-5` → ✅ `color-feedback-error`
   - ❌ `space-4` → ✅ `spacing-component-button-padding`
2. Become constraint designers: Stop designing every state; start defining rules about which choices are valid
3. Learn design as code: Tokens, constraints, and design systems expressed as structured data (JSON, YAML), version-controlled like code
4. Understand agentic interaction patterns: Study AG-UI, A2UI, Microsoft's frameworks for designing where agents and humans collaborate
5. Build design infrastructure, not components: Design systems as APIs, governance as executable rules, tokens as semantic models
6. Study design governance: What rules govern good AI-generated output? How do you maintain brand coherence at scale?
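Points 1 and 3 above meet in a token file: semantic names alias base names, and a resolver follows the chain to a concrete value. A minimal sketch, assuming a simplified flat file where `{name}` marks an alias (the reference style loosely echoes the Design Tokens Community Group draft, but the file shape here is invented for illustration):

```typescript
// Sketch of tokens as version-controlled structured data.
// Semantic tokens alias base tokens; a resolver follows the chain.
const tokenFile: Record<string, string> = {
  // base layer: appearance
  "color-red-6": "#b91c1c",
  "space-4": "16px",
  // semantic layer: intent, aliasing the base layer
  "color-feedback-error": "{color-red-6}",
  "spacing-component-button-padding": "{space-4}",
};

// Resolve a token name to its concrete value, following {alias} chains.
function resolve(name: string, file: Record<string, string>): string {
  const value = file[name];
  if (value === undefined) throw new Error(`unknown token: ${name}`);
  const match = value.match(/^\{(.+)\}$/);
  return match ? resolve(match[1], file) : value;
}

console.log(resolve("color-feedback-error", tokenFile));            // "#b91c1c"
console.log(resolve("spacing-component-button-padding", tokenFile)); // "16px"
```

Because the file is plain structured data, it can be diffed, reviewed, and versioned like any other code, and rebranding becomes a change to the base layer that every semantic token inherits automatically.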
The Evolution Arc
- Stage 1 (2010–2023): Design Systems
- Purpose: Shared language, consistency, reusability
- Artifact: Component library + design tokens
- Success: Component adoption
- Stage 2 (2024–2025): AI-Native Design Systems
- Purpose: Machine-readable design context for AI
- Artifact: Components + semantic tokens + design APIs + metadata
- Success: AI-informed code generation accuracy
- Stage 3 (2026+): Expression Infrastructure
- Purpose: Runtime generation of interfaces aligned to brand and context
- Artifact: Constraint specifications, semantic models, token hierarchies, behavior rules
- Success: Brand coherence in generated UIs
Shift Cards Referenced
- Shift 11: Component Library as Reference → Design System as Operating System
- Shift 12: Designing Specific States → Designing Generative Rules
- Shift 18: Design for Humans Only → Design for Humans AND Agents
- Shift 19: Static Design → Generative UI