How I Vibe-Coded a Pixel-Perfect AI Prototype as a Designer

  • Writer: Manuel S. Escobedo
  • 1 day ago
  • 3 min read

I'm a designer. Not a developer. And I want to be clear about that before anything else — because this article isn't about crossing that line. It's about what happens when you bring your design judgment to a new set of tools.

Vibe coding doesn't replace engineers — it makes your handoff better

The conversation around vibe coding can feel threatening if you're on an engineering team. It shouldn't.

AI can help a designer build a functional prototype in an afternoon. What it cannot do is replace the hard disciplines of production engineering — performance, security, scalability, accessibility at scale. That work still belongs to skilled engineers, and it always will.

What changes is what reaches them. When a designer can validate interaction decisions with a working prototype before a sprint starts, engineers receive clarity instead of open questions. That's better collaboration, not competition.

Design first. Always.

Here's my honest take: most vibe-coded interfaces look generic because they skip the design thinking. The AI doesn't know what emotion you want the user to feel, or why the border should shift color between states, or why a card should clip rather than wrap.

You do. That's the irreplaceable part.

Spend time in Figma first. Work through your interaction states, your visual logic, your motion intent. The clearer your design decisions, the more precise your prompts — and the better your output. Vibe coding amplifies your thinking. It doesn't substitute for it.

My workflow in 3 steps

1 → Figma first. Design your full interaction model before touching any code tool. Map every state. Know what you're building.

2 → Vibe code in the terminal. I use Claude Code in the terminal (not a browser chat window) for better control and precision. Build in layers: one component at a time, test after each addition, and use specific design-language prompts.

"The input border should animate to violet on focus, not snap. The card title should populate the input field in place — not navigate to a new page."

Specific prompts get specific results.
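A prompt like that maps directly to a small amount of code, which is why specificity pays off. Here's a minimal sketch of the focus-border behavior in TypeScript; the state names and hex values are my own illustrative assumptions, not the article's actual design tokens:

```typescript
// Sketch of the "animate to violet on focus, not snap" prompt.
// Colors and names are illustrative assumptions, not the real prototype's values.
type InputState = "idle" | "focused";

function inputBorderStyle(state: InputState): Record<string, string> {
  return {
    border: `1px solid ${state === "focused" ? "#7c3aed" : "#d4d4d8"}`,
    // The CSS transition is what makes the color animate rather than snap.
    transition: "border-color 200ms ease",
  };
}
```

The point isn't the code itself; it's that a vague prompt ("make the input nicer") can't produce this, while a design-language prompt pins down the exact state, property, and motion.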

3 → Deploy on Vercel. Push to Vercel and get a live URL in minutes. Viewers don't need to sign in, and it works on any device. Share it with users, stakeholders, or hiring managers — it's a real prototype, not a Figma link.

Quick setup guide

Figma MCP — connect Claude to your designs

  1. Install Claude for Desktop

  2. Go to Settings → Developer → Edit Config (this opens claude_desktop_config.json)

  3. Add this to your config file:

{
  "mcpServers": {
    "figma": {
      "command": "npx",
      "args": ["-y", "figma-developer-mcp", "--figma-api-key=YOUR_KEY_HERE"]
    }
  }
}
  4. Get your API key at figma.com → Account Settings → Personal Access Tokens

  5. Paste it in, save, restart Claude for Desktop — you're connected

Claude Code — vibe code in the terminal

npm install -g @anthropic-ai/claude-code
mkdir my-prototype && cd my-prototype
claude

Vercel — make it live

npm install -g vercel
vercel

Follow the prompts. 60 seconds later you have a shareable URL.

The honest takeaway

Vibe coding gave me a more honest design process. When something is actually running in a browser, the design questions become real — and you find the answers before anyone else has to spend time on them.

That's the skill worth building. Not to become a developer. To become a designer who can close the loop independently.

Want to see this in practice? I built a full interactive prototype using exactly this workflow.
