5. Compute — The Two Runtimes
Vercel Functions (the unified umbrella as of mid-2025) come in two runtimes. Understanding the difference is essential for every architecture conversation.
5.1 Edge Runtime
What it is: Lightweight execution environment built on V8 isolates — the same engine that runs Chrome. Not a full Node.js environment.
How it works: Code runs in V8 isolates at CDN Points of Presence (PoPs) — physically close to users. Cold starts are up to 9× faster than traditional serverless.
API surface: Web Standard APIs only (fetch, Request, Response, URL, crypto). No file system. No native modules. No arbitrary npm packages.
Execution limits:
- CPU time: 35ms (hard limit — measured in CPU, not wall clock)
- Memory: 128MB
- Response size: 4MB
Best for:
- Authentication checks (`if (!token) redirect('/login')`)
- A/B testing and feature flag routing
- Geolocation-based redirects (`if (geo.country === 'DE') redirect('/de')`)
- Request/response header manipulation
- Rate limiting (simple, IP-based)
- Lightweight personalisation at the network edge
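Several of the items above fit in a single Edge function. A minimal, framework-free sketch using only Web Standard APIs — the `session` cookie name and `/de` prefix are illustrative; `x-vercel-ip-country` is the request header Vercel populates with the caller's country:

```typescript
// Runs at the edge: only Web Standard APIs (Request, Response, URL) are used.
export const config = { runtime: 'edge' };

export default async function handler(request: Request): Promise<Response> {
  const url = new URL(request.url);

  // Authentication check: no session cookie → redirect to /login
  const cookies = request.headers.get('cookie') ?? '';
  if (!/\bsession=/.test(cookies)) {
    return Response.redirect(new URL('/login', url), 307);
  }

  // Geolocation-based redirect using the country header set by the platform
  const country = request.headers.get('x-vercel-ip-country');
  if (country === 'DE' && !url.pathname.startsWith('/de')) {
    return Response.redirect(new URL(`/de${url.pathname}`, url), 307);
  }

  return new Response('ok');
}
```

Note there is no database call and no Node import anywhere in the function — everything stays within the Web API surface the Edge runtime allows.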
Not for:
- Database queries (no persistent connections)
- Long-running tasks
- Node.js-specific libraries
- Anything with PHI/PII that must stay in a specific region
5.2 Serverless Functions (Node.js Runtime)
What it is: Full Node.js environment running in ephemeral containers (microVMs). Full npm package compatibility. Full file system access.
How it works: Each function invocation spins up a container, executes, and returns. With Fluid Compute (see Section 4), multiple invocations can share a single container instance.
Execution limits (plan-dependent):
- Hobby: 60 seconds max
- Pro: 15 seconds default, configurable up to 300 seconds (5 minutes); with Fluid Compute, up to roughly 13 minutes in some configurations
- Enterprise: custom
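The per-function ceiling is raised in code rather than in the dashboard. A minimal sketch — the file path is an illustrative Next.js App Router route, and the value must stay within the plan's cap:

```typescript
// app/api/export/route.ts — illustrative path
// Raises this function's timeout; deploys fail if the value exceeds the plan cap.
export const maxDuration = 300; // seconds
```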
Best for:
- Server-side rendering (SSR pages)
- API routes with database queries
- Authentication with JWT validation
- AI/LLM inference calls
- File processing, image generation
- Anything needing full Node.js APIs
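By contrast with the Edge runtime, a Node.js function can reach for any Node API or npm package. A minimal sketch (the route's purpose and response shape are illustrative):

```typescript
import { createHash } from 'node:crypto'; // Node-only module, unavailable in the Edge runtime

export default async function handler(request: Request): Promise<Response> {
  const body = await request.text();
  // Full Node.js API surface: hash the request body with node:crypto
  const digest = createHash('sha256').update(body).digest('hex');
  return Response.json({ digest });
}
```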
Cold start reality: Traditional serverless suffers from cold starts (100–500ms). Fluid Compute mitigates this significantly via bytecode caching and instance reuse.
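The practical consequence for code: when instances are reused, anything expensive to build (database pools, SDK clients) belongs at module scope so it is constructed once per instance rather than once per request. A sketch with a stand-in client — a real function would hold a DB pool or SDK client here:

```typescript
// Module scope: runs once per container instance, not per invocation.
let coldStarts = 0;

const client = {
  // Simulated expensive initialization (in reality: opening a connection pool)
  connection: (() => { coldStarts++; return 'connected'; })(),
  query: async (sql: string) => `result of: ${sql}`,
};

export default async function handler(request: Request): Promise<Response> {
  // Each invocation reuses the already-initialized client
  const rows = await client.query('SELECT 1');
  return Response.json({ rows, coldStarts });
}
```

Under Fluid Compute, repeated invocations landing on the same instance skip the setup cost entirely; the `coldStarts` counter stays at 1.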
Runtime Comparison — Quick Reference
| | Edge Runtime | Serverless (Node.js) |
|---|---|---|
| Engine | V8 isolates | Node.js (full) |
| Location | CDN PoPs (100+ globally) | Regional data centres (configurable) |
| Cold start | Near-zero | 100–500ms (mitigated by Fluid) |
| Max duration | 35ms CPU time | Minutes of wall clock (plan-dependent) |
| Max memory | 128MB | Up to 3GB |
| npm packages | Web API compatible only | All packages |
| File system | No | Yes (ephemeral) |
| Persistent connections | No | Yes (during invocation) |
| Database access | No (use Edge Config for reads) | Yes |