# Deploy to Vercel
Vercel provides serverless deployment with automatic scaling, global CDN, and zero-config deployments for many frameworks. This guide shows how to deploy a Pancake app to Vercel.
## Prerequisites
- A Vercel account (free tier works)
- Your Pancake app ready to deploy
- Node.js 18+ installed locally
## Quick Deploy

### Install Vercel CLI

```bash
pnpm add -D vercel
```

### Create API Route
Vercel uses file-based routing for serverless functions. Create `api/index.ts`:
```typescript
import { createApp, defineView, defineAction, defineData } from '@pancake-apps/server';
import { z } from 'zod';

const app = createApp({
  name: 'my-app',
  version: '1.0.0',
  views: {
    hello: defineView({
      description: 'A greeting view',
      input: z.object({ name: z.string().optional() }),
      data: z.object({ greeting: z.string() }),
      handler: async ({ name }) => ({
        greeting: `Hello ${name || 'World'}!`,
      }),
      ui: { html: './public/views/hello.html' },
    }),
  },
  actions: {
    // Your actions
  },
  data: {
    // Your data endpoints
  },
});

export default async function handler(req: Request): Promise<Response> {
  return app.handleRequest(req);
}

export const config = {
  runtime: 'edge', // or 'nodejs' for Node.js runtime
};
```

### Configure vercel.json
Create `vercel.json` in your project root:

```json
{
  "version": 2,
  "buildCommand": "pnpm build",
  "outputDirectory": "dist",
  "rewrites": [
    { "source": "/mcp", "destination": "/api" },
    { "source": "/mcp/(.*)", "destination": "/api" },
    { "source": "/.well-known/(.*)", "destination": "/api" },
    { "source": "/openapi.json", "destination": "/api" }
  ]
}
```

This routes MCP and OpenAPI requests to your serverless function.
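The rewrite mapping can be illustrated with a small matcher. This is only a sketch of how these four rules resolve incoming paths, not Vercel's actual routing engine (which supports a richer pattern syntax):

```typescript
// Illustrative resolver for the rewrites in vercel.json above.
// A source either matches exactly or, when it ends in "/(.*)",
// matches any path under that prefix.
const rewrites = [
  { source: '/mcp', destination: '/api' },
  { source: '/mcp/(.*)', destination: '/api' },
  { source: '/.well-known/(.*)', destination: '/api' },
  { source: '/openapi.json', destination: '/api' },
];

function resolve(path: string): string | null {
  for (const { source, destination } of rewrites) {
    if (source.endsWith('/(.*)')) {
      const prefix = source.slice(0, -'/(.*)'.length);
      if (path.startsWith(prefix + '/')) return destination;
    } else if (path === source) {
      return destination;
    }
  }
  return null; // no rewrite: Vercel falls through to static files in public/
}
```

With these rules, `/mcp`, `/mcp/tools`, and `/openapi.json` all land on the serverless function, while a path like `/views/hello.html` is served statically.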
### Deploy

```bash
npx vercel
```

Follow the prompts. Your app is deployed to a `.vercel.app` URL.

For production:

```bash
npx vercel --prod
```

## Project Structure
For Vercel, restructure your project:
```
my-app/
├── api/
│   └── index.ts        # Serverless function
├── public/
│   └── views/          # Static HTML views
│       └── hello.html
├── src/
│   └── views/          # React source (if using React)
│       └── hello/
│           └── index.tsx
├── vercel.json
├── package.json
└── tsconfig.json
```

## Using the Node.js Runtime
For full Node.js compatibility (at the cost of cold start time), use the Node.js runtime:

```typescript
// api/index.ts
export const config = {
  runtime: 'nodejs',
  maxDuration: 30, // seconds
};
```

## Environment Variables
### Via CLI
```bash
vercel env add DATABASE_URL
vercel env add API_KEY
```

### Via Dashboard
- Go to your project in the Vercel dashboard
- Click Settings then Environment Variables
- Add your variables for Production, Preview, and Development
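However you set them, a missing variable otherwise surfaces only at first use inside a request. A fail-fast check at startup can help — a minimal sketch (`requireEnv` is a hypothetical helper, not part of Pancake or Vercel):

```typescript
// Throw immediately at startup if a required variable is absent,
// instead of failing later inside a request handler.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Hypothetical usage: resolve required config once, at module load.
// const databaseUrl = requireEnv('DATABASE_URL');
```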
### In Code
Access environment variables normally:
```typescript
const apiKey = process.env.API_KEY;
```

## Serving View Assets
### Option 1: Public Directory
Put static HTML views in `public/views/`:

```typescript
ui: { html: './public/views/hello.html' }
```

Vercel serves `public/` at the root.
### Option 2: Build to Public
Update your build script to output views to `public/`:

```json
{
  "scripts": {
    "build": "vite build --outDir public/views"
  }
}
```

## CI/CD with GitHub
Connect your GitHub repository to Vercel for automatic deployments:
- Go to vercel.com/new
- Import your repository
- Configure build settings (Vercel auto-detects most)
- Deploy
Every push to main triggers a production deploy. Pull requests get preview deployments.
### GitHub Actions (Alternative)
```yaml
name: Deploy to Vercel

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v4
        with:
          version: 9
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: 'pnpm'
      - run: pnpm install
      - run: pnpm build
      - name: Deploy to Vercel
        uses: amondnet/vercel-action@v25
        with:
          vercel-token: ${{ secrets.VERCEL_TOKEN }}
          vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
          vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
          vercel-args: '--prod'
```

Get your tokens from the Vercel dashboard under Settings then Tokens.
## Custom Domain
- Go to your project in the Vercel dashboard
- Click Settings then Domains
- Add your domain
- Update DNS records as instructed
Vercel provisions SSL certificates automatically.
## Connecting AI Clients

### Claude Desktop
```json
{
  "mcpServers": {
    "my-app": {
      "url": "https://my-app.vercel.app/mcp"
    }
  }
}
```

### ChatGPT
Use your Vercel URL for the plugin manifest:
```
https://my-app.vercel.app/.well-known/openai-plugin.json
```

## Limitations
Vercel serverless functions have some constraints:
| Limit | Hobby | Pro |
|---|---|---|
| Execution time | 10 seconds | 60 seconds |
| Memory | 1024 MB | 3008 MB |
| Payload size | 4.5 MB | 4.5 MB |
For long-running operations, consider:
- Breaking into smaller functions
- Using background jobs (Vercel Cron, external queue)
- Moving to a traditional server
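For the payload cap specifically, you can sanity-check a response body before returning it. A minimal sketch (the helper is hypothetical, and 4.5 MB is interpreted here as 4.5 × 1024 × 1024 bytes):

```typescript
// Vercel rejects request/response payloads above ~4.5 MB, so check the
// encoded byte length (not the character count) before responding.
const MAX_PAYLOAD_BYTES = 4.5 * 1024 * 1024;

function fitsPayloadLimit(body: string): boolean {
  return new TextEncoder().encode(body).length <= MAX_PAYLOAD_BYTES;
}
```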
## Troubleshooting

### Function Timeout
If your function times out:
- Optimize slow database queries
- Add caching for expensive operations
- Increase the timeout in `vercel.json` (Pro plan)
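The caching suggestion can be sketched as an in-memory memoizer with a TTL. Each serverless instance has its own memory, so this only speeds up warm invocations; a shared store (e.g. Redis) is needed across instances. The helper below is hypothetical:

```typescript
// Cache expensive async results per instance, keyed by string, with a TTL.
type Entry = { value: unknown; expires: number };
const cache = new Map<string, Entry>();

async function cached<T>(
  key: string,
  ttlMs: number,
  compute: () => Promise<T>,
): Promise<T> {
  const hit = cache.get(key);
  if (hit && hit.expires > Date.now()) return hit.value as T;
  const value = await compute();
  cache.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}
```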
### Build Errors
Check the build logs in the Vercel dashboard. Common issues:
- Missing dependencies (add to `dependencies`, not `devDependencies`)
- TypeScript errors (run `pnpm typecheck` locally first)
- Environment variables not set
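For example, anything your function imports at runtime must sit in `dependencies`; `devDependencies` are not installed for the production build. A sketch of the split (version numbers are illustrative):

```json
{
  "dependencies": {
    "@pancake-apps/server": "^1.0.0",
    "zod": "^3.0.0"
  },
  "devDependencies": {
    "typescript": "^5.0.0",
    "vercel": "^37.0.0"
  }
}
```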
### Cold Starts
Serverless functions have cold starts. To minimize:
- Keep dependencies minimal
- Use the Edge runtime when possible
- Consider Vercel's Fluid Compute for Pro plans
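Keeping the module's top level cheap is the common thread: defer expensive setup (database clients, SDK initialization) until the first request needs it. A minimal sketch (`lazy` is a hypothetical helper):

```typescript
// Wrap an expensive initializer so it runs at most once, on first use,
// rather than at module load during a cold start.
function lazy<T>(init: () => T): () => T {
  let value: T | undefined;
  let ready = false;
  return () => {
    if (!ready) {
      value = init();
      ready = true;
    }
    return value as T;
  };
}

// Hypothetical usage: the client is constructed on the first request only.
// const getDb = lazy(() => createDbClient(process.env.DATABASE_URL));
```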
## Next Steps
- **Deploy to Cloudflare**: Alternative deployment option
- **Troubleshooting**: Common issues and fixes
- **API Reference**: Complete API documentation