🥞PancakeJS

Deploy to Cloudflare

Cloudflare offers two ways to deploy Pancake apps:

  1. Cloudflare Tunnel: Route traffic to a server running anywhere (your machine, VPS, cloud VM)
  2. Cloudflare Workers: Deploy as an edge function (serverless)

This guide covers both approaches.

Option 1: Cloudflare Tunnel

Cloudflare Tunnel creates a secure connection from your server to Cloudflare's network. Your server can run anywhere, and you get a public URL with automatic HTTPS.

This is the recommended approach if:

  • You want to run the full Node.js server
  • You have an existing VPS or cloud VM
  • You want to use all Node.js features without limitations

Install cloudflared

# macOS
brew install cloudflared

# Debian/Ubuntu
curl -L https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64.deb -o cloudflared.deb
sudo dpkg -i cloudflared.deb

# Other Linux distributions
# Download from: https://developers.cloudflare.com/cloudflare-one/connections/connect-apps/install-and-setup/installation/

# Windows
winget install Cloudflare.cloudflared

Build Your App

pnpm build

Start Your Server

NODE_ENV=production node dist/index.js

Your server is now running locally on port 3000.
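
Before exposing it publicly, it is worth confirming the server answers locally (port 3000 as above):

curl -i http://localhost:3000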

Create a Tunnel (Quick Tunnel)

For quick testing or temporary deployments:

cloudflared tunnel --url http://localhost:3000

This creates a random *.trycloudflare.com URL. No Cloudflare account needed.

Quick Tunnel URLs change every time you restart. For permanent URLs, create a named tunnel.

Create a Named Tunnel (Production)

For production, create a permanent tunnel with a stable URL.

Login to Cloudflare:

cloudflared tunnel login

Create a tunnel:

cloudflared tunnel create my-pancake-app

Create a config file at ~/.cloudflared/config.yml:

tunnel: my-pancake-app
credentials-file: /path/to/my-pancake-app.json

ingress:
  - hostname: myapp.example.com
    service: http://localhost:3000
  - service: http_status:404
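
The credentials-file path is printed by cloudflared tunnel create (by default it lands under ~/.cloudflared/). Before routing traffic, you can check the ingress rules for mistakes:

cloudflared tunnel ingress validate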

Route the tunnel to your domain:

cloudflared tunnel route dns my-pancake-app myapp.example.com

Run the tunnel:

cloudflared tunnel run my-pancake-app

Run as a Service

To keep the tunnel running permanently:

# Linux (systemd)
sudo cloudflared service install
sudo systemctl start cloudflared
sudo systemctl enable cloudflared

# macOS (launchd)
sudo cloudflared service install
sudo launchctl load /Library/LaunchDaemons/com.cloudflare.cloudflared.plist
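
To confirm the connector is healthy on Linux, use the standard systemd tools (nothing Pancake-specific here):

sudo systemctl status cloudflared
sudo journalctl -u cloudflared -f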

Option 2: Cloudflare Workers

Deploy your Pancake app as a Cloudflare Worker for edge deployment with global distribution.

Workers have some limitations: 128MB memory, 30-second CPU time, and not all Node.js APIs are available. For most Pancake apps, this is fine.

Install Wrangler

pnpm add -D wrangler

Create wrangler.toml

name = "my-pancake-app"
main = "dist/index.js"
compatibility_date = "2025-12-01"
compatibility_flags = ["nodejs_compat"]

[build]
command = "pnpm build"

# Optional: Environment variables
[vars]
NODE_ENV = "production"

Adapt Your Server for Workers

Cloudflare Workers use a different request handler. Create src/worker.ts:

import { createApp, defineView } from '@pancake-apps/server';
import { z } from 'zod';

const app = createApp({
  name: 'my-app',
  version: '1.0.0',

  views: {
    // Your views here
  },
});

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    return app.handleRequest(request);
  },
};
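
The Env and ExecutionContext types are not provided by Pancake: ExecutionContext comes from Cloudflare's type definitions (@cloudflare/workers-types, or the worker-configuration.d.ts generated by wrangler types), and Env describes your own bindings. A minimal sketch, assuming the DATABASE_URL secret and LOG_LEVEL var used later in this guide:

// Shape of Env is an assumption; adjust it to your own bindings
interface Env {
  DATABASE_URL: string; // secret set via `wrangler secret put DATABASE_URL`
  LOG_LEVEL?: string;   // non-sensitive value from [vars] in wrangler.toml
}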

Update wrangler.toml:

main = "src/worker.ts"
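
You can also run the Worker locally before deploying:

npx wrangler dev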

Deploy

# Login to Cloudflare
npx wrangler login

# Deploy
npx wrangler deploy

Your app is deployed to my-pancake-app.<your-subdomain>.workers.dev.

Add a Custom Domain (Optional)

In the Cloudflare dashboard:

  1. Go to Workers & Pages
  2. Select your worker
  3. Click Settings → Domains & Routes
  4. Add your custom domain

Cloudflare handles SSL certificates automatically.

Serving Static Assets

For React views, you need to serve the built assets. With Workers:

# wrangler.toml
[site]
bucket = "./dist/ui"

Or use Cloudflare Pages for static hosting and Workers for the API.

Environment Variables

With Tunnel

Use a .env file or export variables:

export DATABASE_URL="postgres://..."
export API_KEY="..."
node dist/index.js
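
On Node.js 20.6 and later, you can load a .env file directly instead of exporting each variable:

node --env-file=.env dist/index.js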

With Workers

Add secrets via Wrangler:

npx wrangler secret put DATABASE_URL
npx wrangler secret put API_KEY

Or use vars in wrangler.toml for non-sensitive values:

[vars]
LOG_LEVEL = "info"
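
Inside a Worker, secrets and vars arrive as properties of the env parameter on the fetch handler, not process.env. A sketch extending the src/worker.ts handler shown earlier (app and Env as defined there):

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Secrets (wrangler secret put) and [vars] values are bindings on env
    if (env.LOG_LEVEL === 'debug') {
      console.log(`incoming ${request.method} ${new URL(request.url).pathname}`);
    }
    return app.handleRequest(request);
  },
};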

CI/CD with GitHub Actions

Automate deployments with GitHub Actions:

Tunnel Deployment

# .github/workflows/deploy.yml
name: Deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: pnpm/action-setup@v4
        with:
          version: 9

      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: 'pnpm'

      - run: pnpm install
      - run: pnpm build

      # Deploy to your server via SSH, rsync, etc.
      - name: Deploy
        run: |
          rsync -avz dist/ user@server:/app/dist/
          ssh user@server "systemctl restart my-pancake-app"
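
The restart step assumes the app itself runs as a systemd service on the server. A minimal unit sketch (paths, user, and service name are placeholders, not something Pancake generates):

# /etc/systemd/system/my-pancake-app.service
[Unit]
Description=My Pancake app
After=network.target

[Service]
WorkingDirectory=/app
ExecStart=/usr/bin/node /app/dist/index.js
Environment=NODE_ENV=production
Restart=on-failure
User=appuser

[Install]
WantedBy=multi-user.target

Enable it once with sudo systemctl enable --now my-pancake-app.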

Workers Deployment

# .github/workflows/deploy.yml
name: Deploy to Workers

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: pnpm/action-setup@v4
        with:
          version: 9

      - uses: actions/setup-node@v4
        with:
          node-version: 22
          cache: 'pnpm'

      - run: pnpm install
      - run: pnpm build

      - name: Deploy to Workers
        uses: cloudflare/wrangler-action@v3
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}

Create the API token in the Cloudflare dashboard under My Profile → API Tokens with the Edit Cloudflare Workers permission.

Connecting AI Clients

Once deployed, configure your AI clients:

Claude Desktop

{
  "mcpServers": {
    "my-app": {
      "url": "https://myapp.example.com/mcp"
    }
  }
}

ChatGPT

Use your deployed URL for the plugin manifest:

https://myapp.example.com/.well-known/openai-plugin.json

Troubleshooting

Tunnel Disconnects

If your tunnel keeps disconnecting:

  • Check your server is running and healthy
  • Verify network connectivity
  • Run cloudflared tunnel info to check status
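
For example, using the tunnel name from earlier:

cloudflared tunnel list
cloudflared tunnel info my-pancake-app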

Worker Memory Limits

If you hit memory limits:

  • Stream large responses instead of buffering (see the sketch after this list)
  • Reduce bundle size with tree-shaking
  • Move heavy computation to external services
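
Streaming keeps only one chunk in memory at a time instead of the whole payload. A minimal sketch using the Web Streams API available in Workers (streamRows and its AsyncIterable input are illustrative, not a Pancake API):

// Return a streamed Response rather than building one large string in memory
function streamRows(rows: AsyncIterable<string>): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const row of rows) {
        controller.enqueue(encoder.encode(row + '\n'));
      }
      controller.close();
    },
  });
  return new Response(body, { headers: { 'content-type': 'text/plain' } });
}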

CORS Issues

If AI clients can't connect due to CORS:

const app = createApp({
  // ...
  config: {
    cors: {
      origin: '*', // Or specific origins
      methods: ['GET', 'POST', 'OPTIONS'],
    },
  },
});
