A vibe coding platform that lets users generate code with AI agents and deploy to Vercel with one click.
This template demonstrates building an AI-powered code generation platform using:
- Orchestrator + Coding Agent architecture (Claude Agent SDK, OpenAI Codex) with smart routing
- Vercel Sandbox for secure code execution (Firecracker MicroVMs)
- Vercel SDK for deploying generated code to production
- Real-time streaming of AI responses and tool execution with smooth rendering
- AI Elements component library for chat UI (Tool calls, Messages, Conversation)
```
┌────────────────────────────────────────────────────────────────────┐
│                         Platform Template                          │
├────────────────────────────────────────────────────────────────────┤
│                                                                    │
│  ┌──────────────┐     ┌──────────────────────────────────────────┐ │
│  │   Chat UI    │     │            Orchestrator Agent            │ │
│  │ (AI Elements │────▶│  (claude-sonnet-4-6 via Vercel AI SDK)   │ │
│  │  components) │     │ Routes: answer directly OR call BuildApp │ │
│  └──────────────┘     └──────────────────┬───────────────────────┘ │
│                                          │ BuildApp tool call      │
│                                          ▼                         │
│            ┌──────────────────────────────────────────┐            │
│            │              Agent Registry              │            │
│            │  ┌────────────────┬──────────────────┐   │            │
│            │  │  Claude Agent  │   Codex Agent    │   │            │
│            │  │  (Agent SDK)   │  (OpenAI Codex)  │   │            │
│            │  └───────┬────────┴────────┬─────────┘   │            │
│            └──────────┼─────────────────┼─────────────┘            │
│                       └────────┬────────┘                          │
│                                ▼                                   │
│            ┌──────────────────────────────────────────┐            │
│            │      AI Gateway (VERCEL_OIDC_TOKEN)      │            │
│            └──────────────────────────────────────────┘            │
│                                │                                   │
│                                ▼                                   │
│            ┌──────────────────────────────────────────┐            │
│            │         Shared MCP Sandbox Tools         │            │
│            │   read_file │ write_file │ run_command   │            │
│            └──────────────────────────────────────────┘            │
│                                │                                   │
│                                ▼                                   │
│            ┌──────────────────────────────────────────┐            │
│            │             @vercel/sandbox              │            │
│            │          (Firecracker MicroVM)           │            │
│            └──────────────────────────────────────────┘            │
│                                │                                   │
│                                ▼                                   │
│            ┌──────────────────────────────────────────┐            │
│            │             Deploy to Vercel             │            │
│            │               @vercel/sdk                │            │
│            └──────────────────────────────────────────┘            │
└────────────────────────────────────────────────────────────────────┘
```
- Node.js 20+
- pnpm 9+
```shell
pnpm install
```

Create a `.env.local` file:
```shell
# AI Gateway (routes all LLM calls)
AI_GATEWAY_BASE_URL=https://ai-gateway.vercel.sh
VERCEL_OIDC_TOKEN=        # For AI Gateway auth

# Vercel Deployments
VERCEL_PARTNER_TOKEN=
VERCEL_PARTNER_TEAM_ID=

# Vercel OAuth
VERCEL_CLIENT_ID=
VERCEL_CLIENT_SECRET=

# Redis
REDIS_URL=                # Redis connection string

# Proxy URL
PROXY_BASE_URL=

# Session encryption
JWE_SECRET=               # 256-bit base64 key
```

Then start the dev server:

```shell
pnpm dev
```

Open http://localhost:3000 to see the app.
```
platform-template/
├── app/                          # Next.js App Router
│   ├── api/
│   │   ├── chat/
│   │   │   ├── route.ts          # Chat streaming endpoint (POST)
│   │   │   └── [id]/stream/      # Resumable stream endpoint (GET)
│   │   ├── ai/                   # AI proxy and session management
│   │   ├── auth/                 # Vercel OAuth routes
│   │   └── botid/                # Bot detection
│   ├── rpc/[[...rest]]/          # oRPC endpoint handler
│   ├── chat/[chatId]/page.tsx    # Resume existing chat
│   ├── page.tsx                  # Home page (new chat)
│   └── layout.tsx                # Root layout with providers
│
├── components/                   # React components
│   ├── ai-elements/              # AI UI components (chat, tools, terminal)
│   │   ├── conversation.tsx      # Conversation container with auto-scroll
│   │   ├── message.tsx           # Message + streaming markdown renderer
│   │   ├── tool.tsx              # Tool call display (input/output/status)
│   │   ├── code-block.tsx        # Syntax-highlighted code blocks (shiki)
│   │   ├── terminal.tsx          # Terminal output renderer
│   │   ├── file-tree.tsx         # File tree display
│   │   └── web-preview.tsx       # Inline web preview component
│   ├── chat/
│   │   └── chat.tsx              # Main chat panel with streaming logic
│   ├── ui/                       # Base UI components (shadcn/ui)
│   ├── main-layout.tsx           # Main app layout
│   ├── preview.tsx               # Live preview iframe
│   ├── agent-selector.tsx        # Agent selection dropdown
│   ├── template-selector.tsx     # Project template selector
│   ├── deploy-popover.tsx        # Vercel deploy popover
│   └── workspace-panel.tsx       # File explorer/workspace
│
├── lib/                          # Core business logic
│   ├── agents/                   # AI agent system
│   │   ├── types.ts              # AgentProvider interface & StreamChunk types
│   │   ├── registry.ts           # Agent registry (Claude default, Codex)
│   │   ├── claude-agent.ts       # Claude Agent SDK implementation
│   │   ├── codex-agent.ts        # OpenAI Codex implementation
│   │   ├── orchestrator-agent.ts # Routing layer (answer vs. BuildApp)
│   │   └── stream.ts             # Stream utilities
│   ├── auth/                     # Authentication (OAuth, JWT)
│   ├── hooks/                    # React hooks
│   │   ├── use-persisted-chat.ts # Chat history persistence
│   │   └── use-sandbox-from-url.ts
│   ├── rpc/                      # oRPC router & procedures
│   │   └── procedures/           # chat, sandbox, deploy, claim
│   ├── sandbox/                  # Sandbox setup and management
│   ├── templates/                # Project templates (Next.js, Vite, TanStack Start)
│   └── store/                    # Zustand state management
│
└── types/                        # Global TypeScript types
```
| Category | Technology |
|---|---|
| Framework | Next.js 16 (App Router) |
| Runtime | React 19 |
| AI SDKs | Claude Agent SDK, Vercel AI SDK v6 |
| Sandbox | @vercel/sandbox (Firecracker MicroVMs) |
| Deployment | @vercel/sdk |
| RPC | oRPC (type-safe) |
| State | Zustand |
| Validation | Zod v4 |
| Styling | Tailwind CSS 4 |
| UI Components | Radix UI |
| Auth | Arctic (OAuth), Jose (JWT) |
| Persistence | Redis |
| Streaming | resumable-stream (connection recovery) |
| Markdown | streamdown (streaming markdown) |
Every message first goes through the orchestrator agent (claude-sonnet-4-6 via Vercel AI SDK), which decides whether to:
- Answer directly — for general questions, small talk, or anything that doesn't require code changes
- Call `BuildApp` — to delegate to the selected coding agent (Claude Agent SDK or Codex) inside a Vercel Sandbox
This avoids spinning up a sandbox for simple conversational turns.
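The routing contract can be sketched in plain TypeScript. Note the hedging: in the real template the LLM itself makes this decision by calling (or not calling) the `BuildApp` tool via the Vercel AI SDK; `RouteDecision` and `routeMessage` below are illustrative names, and the keyword heuristic is only a stand-in for the model's judgment.

```typescript
// The orchestrator produces one of two outcomes: a direct answer, or a
// delegation to a coding agent. These type/function names are hypothetical.
type RouteDecision =
  | { kind: "answer"; text: string }
  | { kind: "build-app"; instructions: string; agentId: "claude" | "codex" };

// Stand-in for the model's routing judgment: a trivial keyword heuristic.
function routeMessage(message: string, agentId: "claude" | "codex"): RouteDecision {
  const wantsCode = /\b(build|create|add|fix|implement|deploy)\b/i.test(message);
  return wantsCode
    ? { kind: "build-app", instructions: message, agentId }
    : { kind: "answer", text: "Happy to help! Ask me to build something." };
}
```

Because the "answer" branch never touches the registry, a small-talk turn costs one LLM call and zero sandbox startups.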
All coding agents implement a unified `AgentProvider` interface:
```typescript
interface AgentProvider {
  id: string;
  name: string;
  description: string;
  logo: string;
  execute(params: ExecuteParams): AsyncIterable<StreamChunk>;
}
```

Agents yield `StreamChunk` events that are rendered in real time:
- `message-start` — new assistant message begins
- `text-delta` — incremental text (smoothed with a 20ms delay)
- `reasoning-delta` — chain-of-thought text
- `tool-start` — tool execution beginning
- `tool-input-delta` — streaming tool input
- `tool-result` — tool execution result
- `data` — custom data parts (sandbox status, preview URL, file writes, etc.)
- `error` — error with an optional code
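A minimal provider and consumer can be sketched against this interface. The `StreamChunk` union below mirrors only a subset of the variants listed above, and the "echo" agent is a hypothetical stand-in, not the real Claude or Codex implementation:

```typescript
// Partial mirror of the shapes in lib/agents/types.ts (subset of variants).
type StreamChunk =
  | { type: "message-start" }
  | { type: "text-delta"; text: string }
  | { type: "tool-start"; toolName: string }
  | { type: "error"; message: string; code?: string };

interface ExecuteParams {
  prompt: string;
}

interface AgentProvider {
  id: string;
  name: string;
  description: string;
  logo: string;
  execute(params: ExecuteParams): AsyncIterable<StreamChunk>;
}

// A toy agent that yields the same chunk sequence a real agent would.
async function* echoExecute({ prompt }: ExecuteParams): AsyncGenerator<StreamChunk> {
  yield { type: "message-start" };
  for (const word of prompt.split(" ")) {
    yield { type: "text-delta", text: word + " " };
  }
}

const echoAgent: AgentProvider = {
  id: "echo",
  name: "Echo",
  description: "Yields the prompt back as text deltas",
  logo: "/echo.svg",
  execute: echoExecute,
};

// Consumers drain the iterable and fan chunks out to the UI.
async function collectText(agent: AgentProvider, prompt: string): Promise<string> {
  let text = "";
  for await (const chunk of agent.execute({ prompt })) {
    if (chunk.type === "text-delta") text += chunk.text;
  }
  return text.trim();
}
```

Because every agent speaks this one chunk vocabulary, the chat UI renders Claude and Codex output through the same components.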
Streams are resumable — if the user refreshes mid-generation, the client reconnects via a GET endpoint (/api/chat/[id]/stream) backed by resumable-stream and Redis pub/sub. The active stream ID is persisted in the chat session and cleared on completion.
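The resume bookkeeping can be illustrated with an in-memory stand-in. The template actually uses the `resumable-stream` package with Redis pub/sub; the Maps and function names below are assumptions made purely to show the lifecycle (persist stream ID, buffer chunks, replay on reconnect, clear on completion):

```typescript
// In-memory sketch of the resume flow; Redis pub/sub plays both roles in production.
const activeStreamByChat = new Map<string, string>(); // chatId -> active stream ID
const bufferedChunks = new Map<string, string[]>();   // streamId -> chunks so far

function startStream(chatId: string, streamId: string): void {
  activeStreamByChat.set(chatId, streamId); // persisted so GET /api/chat/[id]/stream can find it
  bufferedChunks.set(streamId, []);
}

function publish(streamId: string, chunk: string): void {
  bufferedChunks.get(streamId)?.push(chunk); // live subscribers would also be notified here
}

// On refresh, the client replays buffered chunks, then follows the live stream.
function resume(chatId: string): string[] | null {
  const streamId = activeStreamByChat.get(chatId);
  return streamId ? [...(bufferedChunks.get(streamId) ?? [])] : null;
}

function completeStream(chatId: string): void {
  const streamId = activeStreamByChat.get(chatId);
  activeStreamByChat.delete(chatId); // cleared on completion, so resume() returns null
  if (streamId) bufferedChunks.delete(streamId);
}
```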
The chat UI is built with a set of composable AI Elements components:
- `<Conversation>` / `<ConversationContent>` — scrollable message container with auto-scroll-to-bottom
- `<Message>` / `<MessageResponse>` — user and assistant message rendering with streaming markdown
- `<Tool>` / `<ToolHeader>` / `<ToolInput>` / `<ToolOutput>` — collapsible tool call display with status badges
All AI-generated code runs in isolated Firecracker MicroVMs via @vercel/sandbox. Templates define setup commands per framework. The preview URL is yielded immediately so the iframe loads while the agent works.
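The shared tool surface (`read_file`, `write_file`, `run_command`) can be sketched against an in-memory file store. The real handlers proxy into a `@vercel/sandbox` MicroVM; the signatures and the `files` Map below are illustrative assumptions, not the template's actual implementation:

```typescript
// Illustrative handlers for the shared sandbox tools; production versions
// forward to a Firecracker MicroVM. The in-memory `files` Map is a stand-in.
const files = new Map<string, string>();

function writeFile(path: string, content: string): { ok: true } {
  files.set(path, content);
  return { ok: true };
}

function readFile(path: string): string {
  const content = files.get(path);
  if (content === undefined) throw new Error(`ENOENT: ${path}`);
  return content;
}

// run_command would stream stdout/stderr from the VM; here it only echoes.
function runCommand(cmd: string): { exitCode: number; stdout: string } {
  return { exitCode: 0, stdout: `ran: ${cmd}` };
}
```

Exposing one tool set over MCP is what lets both the Claude and Codex agents operate on the same sandbox without agent-specific glue.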
Single router definition shared by server and client with full TypeScript inference.
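The payoff of a single typed router can be shown without oRPC itself. This dependency-free sketch uses hypothetical procedure names and is not oRPC's actual API; it only demonstrates how one definition yields both server handlers and a fully inferred client:

```typescript
// One source of truth: each procedure's input and output types.
type Procedures = {
  "sandbox.create": { input: { template: string }; output: { sandboxId: string } };
  "deploy.start": { input: { sandboxId: string }; output: { url: string } };
};

// Server side: handlers are checked against the same definition.
const handlers: {
  [K in keyof Procedures]: (input: Procedures[K]["input"]) => Procedures[K]["output"];
} = {
  "sandbox.create": ({ template }) => ({ sandboxId: `sbx_${template}` }),
  "deploy.start": ({ sandboxId }) => ({ url: `https://${sandboxId}.vercel.app` }),
};

// Client side: the call signature is inferred, so there is no codegen step
// and no drift between client and server types.
function call<K extends keyof Procedures>(
  name: K,
  input: Procedures[K]["input"],
): Procedures[K]["output"] {
  return handlers[name](input);
}
```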
```shell
pnpm dev          # Start development server
pnpm build        # Build for production
pnpm start        # Start production server
pnpm lint         # Run ESLint
pnpm format       # Format with Prettier
pnpm test         # Run tests
pnpm test:watch   # Run tests in watch mode
```

- Vercel AI SDK - AI SDK documentation
- Vercel Sandbox - Sandbox documentation
- oRPC - Type-safe RPC framework