Minimal FastAPI app that runs the OpenAI Agents SDK with Vercel Sandbox on Vercel's Python runtime. Each request spins up an isolated microVM, gives the agent shell access to analyze data, and tears it down when done.
- Python 3.12+ (Vercel default is 3.12).
- A Vercel account and the Vercel CLI (`npm i -g vercel`).
- An OpenAI or Vercel AI Gateway API key.
| Variable | Required | Description |
|---|---|---|
| `OPENAI_API_KEY` | Yes | OpenAI or Vercel AI Gateway API key. |
| `VERCEL_TOKEN` | Yes | Vercel access token. Create one at https://vercel.com/account/tokens. |
| `VERCEL_TEAM_ID` | Yes | Your Vercel team ID (starts with `team_`). Found under Team Settings > General. |
| `VERCEL_PROJECT_ID` | Yes | Your Vercel project ID (starts with `prj_`). Found under Project Settings > General. |
| `OPENAI_DEFAULT_MODEL` | No | Default model when the request body omits `model`. Falls back to `gpt-4.1-mini`. |
| `OPENAI_BASE_URL` | No | When using the AI Gateway, set this to `https://ai-gateway.vercel.sh/v1`. |
Copy the example env file and fill in your values:
```shell
cp .env.example .env.local
```

Then edit `.env.local` with your keys:

```
OPENAI_API_KEY=sk-...
VERCEL_TOKEN=your_access_token
VERCEL_TEAM_ID=team_xxx
VERCEL_PROJECT_ID=prj_xxx
```
```shell
uv sync
uv run uvicorn app:app --reload --host 127.0.0.1 --port 8000
```

Open http://127.0.0.1:8000 to use the interactive demo. The agent has shell access to a sandbox with sample sales data (`sales.csv`).
API endpoints:

- `GET /api/health` returns `{"status": "ok", "openai_configured": true}`.
- `POST /api/run` with `{"input": "Which region grew the most?"}` runs the sandbox agent.
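The run endpoint can be called from any HTTP client. A small sketch of the documented request shape using only the standard library (the `BASE_URL` assumes the local dev server from the step above; calling `run_agent` requires that server to be running):

```python
import json
from urllib import request

BASE_URL = "http://127.0.0.1:8000"  # local dev server; adjust for your deployment

def build_run_payload(prompt: str) -> bytes:
    """Body for POST /api/run, matching the documented request shape."""
    return json.dumps({"input": prompt}).encode()

def run_agent(prompt: str) -> dict:
    """POST the prompt to /api/run and return the parsed JSON response."""
    req = request.Request(
        f"{BASE_URL}/api/run",
        data=build_run_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```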
```shell
vercel
```

Vercel detects `app.py` and the `app` ASGI instance. Dependencies come from `pyproject.toml`.
After deploying, make sure OPENAI_API_KEY, VERCEL_TOKEN, VERCEL_TEAM_ID, and VERCEL_PROJECT_ID are set under Project Settings > Environment Variables.
Sandbox creation and agent runs can take several seconds. Heavy workloads may need Fluid Compute or Vercel Workflow for durable steps.
- Each `POST /api/run` creates a fresh Vercel Sandbox microVM with sample data.
- A `SandboxAgent` with `Shell` capability receives the user's prompt.
- The agent writes and runs shell commands inside the sandbox to answer the question.
- The sandbox is torn down after the response is returned.
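The lifecycle above follows a create/use/teardown pattern. A sketch of that pattern with a stand-in `Sandbox` class (the real Vercel Sandbox SDK calls differ; `run` and `stop` here are illustrative names, not the actual API):

```python
class Sandbox:
    """Stand-in for the Vercel Sandbox client; method names are illustrative."""

    def __init__(self):
        self.alive = True
        # The demo sandbox ships with sample sales data.
        self.files = {"sales.csv": "region,revenue\nEMEA,120\nAPAC,95\n"}

    def run(self, command: str) -> str:
        # A real sandbox would execute the shell command inside the microVM.
        return f"ran: {command}"

    def stop(self):
        self.alive = False


def handle_run(prompt: str) -> str:
    sandbox = Sandbox()  # fresh microVM per request
    try:
        # The agent decides which commands to run; a single step is shown.
        return sandbox.run(f"head sales.csv  # answering: {prompt}")
    finally:
        sandbox.stop()  # torn down even if the agent errors
```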
If you want to use the Vercel AI Gateway with the OpenAI Agents SDK, three changes are needed:
- Create a new AI Gateway API key and set it as your `OPENAI_API_KEY` environment variable under Project Settings > Environment Variables.
- Set the `OPENAI_BASE_URL` environment variable to `https://ai-gateway.vercel.sh/v1` under Project Settings > Environment Variables.
- Ensure the provider prefix on your model ID is not stripped by setting `model_provider=MultiProvider(openai_prefix_mode="model_id")` in your `RunConfig`, as shown in the example.
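Putting the three changes together, the run configuration might look like the following sketch. The `OPENAI_API_KEY` and `OPENAI_BASE_URL` values come from the environment; the model ID and `openai_prefix_mode` usage follow the example this README references, so treat this as a configuration fragment rather than verified code:

```python
from agents import RunConfig
from agents.models.multi_provider import MultiProvider

# Keep the provider prefix on the model ID so the AI Gateway can route it.
run_config = RunConfig(
    model="openai/gpt-4.1-mini",  # provider-prefixed model ID (illustrative)
    model_provider=MultiProvider(openai_prefix_mode="model_id"),
)
```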
MIT (match your org's policy when publishing).