AI-Generated Prototypes

5 min read
UX Eng

AI gets you 80% there. You own the last 20% — accessibility, coherence, edge cases.

TL;DR

  • AI can turn sketches, wireframes, or text descriptions into clickable prototypes. Fast. Not perfect.
  • Research suggests roughly 70% design-system accuracy in a single pass (e.g., Atlassian). Handoffs become handshakes: wireframes, specs, and code are generated together. Keeping Figma and code aligned improves AI output.
  • Use AI for rapid iteration. You verify layout, interaction, accessibility. The bridge between design and code is your job.

Design-to-code tools have been around for years. AI makes them better — and more dangerous. You can go from "a rough sketch" to "something that looks like an app" in minutes. But "looks like" isn't "works like." Your job as a UX engineer is to close that gap.

What AI Does Well

Layout and structure:

  • Given a sketch or description, AI can generate a reasonable component structure. Flexbox, grid, basic responsiveness.
  • Good for first pass. You refine.

Styling:

  • Colors, typography, spacing — AI can follow a design system if you give it tokens. It will also invent things. Verify.
  • Use design tokens. Constrain the output.
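Constraining the output can itself live in code. A minimal sketch, assuming TypeScript prototypes (the token names and values here are illustrative, not from any real system):

```typescript
// Hypothetical design tokens; replace with your system's real values.
const tokens = {
  color: { primary: "#0052cc", surface: "#ffffff", text: "#172b4d" },
  space: { xs: 4, sm: 8, md: 16, lg: 24 },
} as const;

// Flag hard-coded colors that aren't in the token set (a common AI tell).
function isTokenColor(value: string): boolean {
  return (Object.values(tokens.color) as string[]).includes(value.toLowerCase());
}
```

A quick pass like this over generated styles surfaces every invented color so you can swap it for a token.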

Boilerplate interactions:

  • Buttons, forms, basic navigation. AI knows common patterns. It'll produce something usable.
  • Edge cases (loading states, errors, empty states) — AI often forgets. You add them.

What AI Misses

Accessibility:

  • ARIA labels, focus order, keyboard nav, contrast — AI sometimes gets these right. Often it doesn't.
  • Always run an a11y check. Add what's missing. This is non-negotiable.
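The check doesn't have to be fancy to catch the obvious gaps. A sketch of walking a prototype tree for missing accessible names (the node shape and function names are illustrative; a real audit should also run a dedicated tool such as axe):

```typescript
// Illustrative prototype node shape, not any real framework's type.
interface PrototypeNode {
  tag: string;
  label?: string; // accessible name: aria-label, alt text, visible text, etc.
  children?: PrototypeNode[];
}

// Interactive or media elements that must have an accessible name.
const NEEDS_LABEL = new Set(["button", "input", "img", "a"]);

// Recursively collect every element that has no accessible name.
function auditA11y(node: PrototypeNode, path = node.tag): string[] {
  const issues: string[] = [];
  if (NEEDS_LABEL.has(node.tag) && !node.label) {
    issues.push(`${path}: missing accessible name`);
  }
  (node.children ?? []).forEach((child, i) => {
    issues.push(...auditA11y(child, `${path} > ${child.tag}[${i}]`));
  });
  return issues;
}
```

Run it on every generated screen; an empty result is the minimum bar, not the finish line.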

Design system coherence:

  • AI might mix styles, use wrong components, or ignore your spacing scale. One-off prototypes are fine; production-ready needs your eye.
  • Treat AI output as a draft. Apply design system rules in your edit pass.

Interaction details:

  • Hover states, transitions, micro-interactions — AI can add some. It won't match your design system's motion language.
  • You define and enforce consistency.

Edge cases and states:

  • Empty, loading, error, success. AI tends toward the happy path. Your prototypes need the full set.
  • Add them. Document them. Make them part of the handoff.
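One way to make the full set unavoidable is to encode it in the prototype's types. A sketch, assuming TypeScript (the state and function names are illustrative):

```typescript
// Every screen handles all four states; the compiler enforces it.
type ScreenState<T> =
  | { kind: "loading" }
  | { kind: "empty" }
  | { kind: "error"; message: string }
  | { kind: "ready"; data: T };

// Exhaustive switch: forgetting a state is a compile error, not a TODO.
function render(state: ScreenState<string[]>): string {
  switch (state.kind) {
    case "loading": return "Spinner";
    case "empty":   return "Nothing here yet";
    case "error":   return `Error: ${state.message}`;
    case "ready":   return state.data.join(", ");
  }
}
```

A discriminated union like this turns "AI forgot the error state" from a review finding into a type error.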

The Workflow

  1. Generate — Sketch or describe. Get AI output. Don't expect perfection.
  2. Audit — Layout, spacing, components. Does it match the design system?
  3. Accessibility pass — Semantics, keyboard, screen reader. Fix what's broken.
  4. Add states — Loading, error, empty. AI won't. You do.
  5. Handoff — Clean, documented, ready for dev. Your name on it. Your standards.
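Steps 2 through 4 can even be written down as a literal gate. A minimal sketch (the item wording is illustrative; use your own definition of done):

```typescript
// The audit, accessibility, and states passes as an explicit handoff gate.
const handoffChecklist = [
  "audited against design system",
  "accessibility pass complete",
  "loading, error, and empty states added",
];

// Nothing ships to dev until every item is checked off.
function readyForHandoff(done: Set<string>): boolean {
  return handoffChecklist.every((item) => done.has(item));
}
```

Whether it lives in code, a PR template, or a doc matters less than it being checked every time.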

When to Use AI, When to Build

Use AI for:

  • Early exploration, "what if we tried X?"
  • Speeding up first drafts for stakeholder review
  • Learning a new pattern or component structure

Build by hand when:

  • It's going to production
  • Accessibility is critical (and it usually is)
  • Design system compliance matters
  • The design is complex or novel

AI Disruption Risk for UX Engineers

Moderate Risk

AI generates ~70% design-system-accurate prototypes in one pass. Accessibility, edge states, and design system coherence stay human. Moderate risk for those who skip the audit pass before handoff.

Without AI, every step is manual: sketch → manual prototype; design → handoff doc → code.

Quick Check

What must you always add to AI-generated prototypes before handoff?

Do This Next

  1. Run one experiment — Sketch a simple screen. Generate with AI. List everything you had to fix. That's your QA checklist.
  2. Create an accessibility checklist — Run every AI-generated prototype through it. Make it a habit.
  3. Define "production-ready" — What must be true before you hand off? Document it. Apply it to AI output before it reaches dev.