
Edge Computing with Cloudflare Workers: Bringing Code Closer to Users

The premise of edge computing is simple: run your code where your users are instead of where your server is. If your origin server sits in a single data centre in Virginia and your user is in Tokyo, every request travels 14,000 kilometres each way. At the speed of light in fibre, that is roughly 70ms of pure physics-imposed latency before your server even starts processing the request. Multiply by the number of round trips a typical page load requires, and you understand why geography matters more than most performance optimisations.

Cloudflare Workers execute JavaScript (and WebAssembly) across Cloudflare's network of 300+ data centres worldwide. Your code runs within milliseconds of the user, eliminates the origin round trip for logic that can be resolved at the edge, and integrates with Cloudflare's storage primitives (KV, R2, D1, Durable Objects) for stateful applications. Having run Workers in production for this site and several others, I can say the platform has reached a maturity level where it is no longer experimental: it is infrastructure.

This page covers practical edge computing patterns with Workers, the performance characteristics observed in production, and the architectural decisions that determine whether edge deployment improves or complicates your application. The page is part of the web development section and connects to the web performance topic hub.


What Workers actually are

A Cloudflare Worker is a V8 isolate — not a container, not a virtual machine. Each request spins up a lightweight JavaScript execution context that shares no state with other requests. The cold start time is under 5ms, compared to 100–500ms for traditional serverless functions (Lambda, Cloud Functions). This matters because cold starts are not rare events on edge platforms — with 300+ deployment locations, each location may see lower traffic density and more frequent cold starts.

Workers use a programming model borrowed from the Service Worker API: a fetch handler intercepts HTTP requests and can modify them, generate responses, or forward to an origin server. If you have written a service worker for a PWA, the model is familiar.

export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // Handle at the edge
    if (url.pathname === '/api/geo') {
      return new Response(JSON.stringify({
        country: request.cf.country,
        city: request.cf.city,
        colo: request.cf.colo,
      }), { headers: { 'Content-Type': 'application/json' } });
    }

    // Forward everything else to origin
    return fetch(request);
  }
};

Practical use cases

Request routing and A/B testing

Edge-based routing eliminates the origin round trip for traffic splitting decisions. Instead of every request hitting your origin to determine which variant to serve, the Worker makes the decision at the edge and either serves from cache or forwards to the appropriate backend. For A/B tests with cached variants, the response time approaches CDN cache hit speed.
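A minimal sketch of that pattern, assuming a cookie named `ab_variant` and two hypothetical origin hosts (the cookie name, hosts, and 50/50 split are illustrative, not part of the Workers API):

```javascript
// Assign a variant once, pin it with a cookie, and route to the matching origin
function pickVariant(cookieHeader) {
  const match = /ab_variant=(a|b)/.exec(cookieHeader || '');
  return match ? match[1] : (Math.random() < 0.5 ? 'a' : 'b');
}

const worker = {
  async fetch(request) {
    const variant = pickVariant(request.headers.get('Cookie'));
    const url = new URL(request.url);
    url.hostname = variant === 'b' ? 'b.origin.example.com' : 'a.origin.example.com';
    const upstream = await fetch(new Request(url, request));
    // Re-issue the response with mutable headers so the assignment can be
    // pinned, giving the user a consistent variant on subsequent requests
    const response = new Response(upstream.body, upstream);
    response.headers.append('Set-Cookie', `ab_variant=${variant}; Path=/; Max-Age=86400`);
    return response;
  }
};
```

Because the cookie travels with every request, the split decision never needs origin state.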

Authentication and access control

Validating JWTs, checking API keys, or enforcing geographic restrictions at the edge means rejected requests never reach your origin. This reduces origin load and improves response time for both accepted and rejected requests. The cryptographic primitives available in Workers (Web Crypto API) handle JWT verification efficiently.

Dynamic content assembly

For pages that combine static content with personalised elements (user name, locale-specific pricing, feature flags), Workers can assemble the response at the edge by combining cached HTML with dynamic fragments. Cloudflare's HTMLRewriter API parses and modifies HTML as it streams, enabling edge-side includes without the complexity of ESI or the latency of origin-side assembly.
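A sketch of that assembly step. HTMLRewriter and its element handlers are part of the Workers runtime; the `#greeting` selector and the `X-User-Name` header are illustrative assumptions:

```javascript
function greetingFor(name) {
  return name ? `Welcome back, ${name}` : 'Welcome';
}

const worker = {
  async fetch(request, env) {
    // Fetch the cacheable HTML shell (served from Cloudflare's cache when hot)
    const shell = await fetch(request);
    // Assumed header set by an upstream auth layer, not a Workers built-in
    const name = request.headers.get('X-User-Name');
    // Rewrite the shell as it streams; the page starts arriving at the
    // browser before the whole document has been processed
    return new HTMLRewriter()
      .on('#greeting', {
        element(el) {
          el.setInnerContent(greetingFor(name));
        },
      })
      .transform(shell);
  }
};
```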

API response transformation

Workers sitting between a client and a backend API can transform responses — reshaping JSON, filtering fields, adding CORS headers, converting formats — without modifying the backend. This is particularly useful when you cannot modify the origin API but need the response in a different shape for your frontend.
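A sketch of that shim, assuming a hypothetical backend at api.example.com whose user objects carry more fields than the frontend needs:

```javascript
function reshapeUser(raw) {
  // Keep only the fields the frontend needs; the backend shape is assumed
  return { id: raw.id, name: raw.display_name, avatar: raw.avatar_url };
}

const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    // Hypothetical backend API the Worker fronts
    const upstream = await fetch(`https://api.example.com${url.pathname}`, request);
    const body = JSON.stringify(reshapeUser(await upstream.json()));
    return new Response(body, {
      status: upstream.status,
      headers: {
        'Content-Type': 'application/json',
        // Add the CORS header the backend does not send
        'Access-Control-Allow-Origin': '*',
      },
    });
  }
};
```

The backend never changes; the Worker owns the contract the frontend sees.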


Performance observations

From running Workers in production for request routing and authentication, the observed performance characteristics are:

  • Cold start: Consistently under 5ms, often under 1ms. Unnoticeable to users.
  • Execution time: Simple routing logic completes in 0.5–2ms. JWT validation adds 1–3ms. HTMLRewriter processing adds 2–10ms depending on document size.
  • Total response time for edge-resolved requests: 10–30ms from the user's perspective, compared to 100–300ms for origin-resolved requests.

The improvement is most dramatic for users far from the origin. A user in Sydney accessing an origin in Europe sees 300ms+ for origin requests but 20ms for edge-resolved requests. The closer the user is to the origin, the smaller the delta.


Storage at the edge

Workers alone are stateless, but Cloudflare's storage primitives enable stateful edge applications:

KV (Key-Value): Eventually consistent, globally replicated. Good for configuration, feature flags, and cached data where strong consistency is not required. Read latency is under 10ms at the edge.

R2 (Object Storage): S3-compatible object storage with no egress fees. Good for serving static assets, user uploads, and large datasets from the edge.

D1 (SQL Database): SQLite-based relational database at the edge. Good for applications that need relational queries without an origin database round trip. Still maturing in 2026 but usable for read-heavy workloads.

Durable Objects: Strongly consistent, single-instance coordination primitives. Good for real-time collaboration, rate limiting, and any use case requiring exactly-once semantics.
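As one example of wiring these together, a feature-flag gate backed by KV; the `FLAGS` namespace binding and the `flags.json` key are assumptions you would configure in wrangler.toml:

```javascript
function isEnabled(flags, name) {
  return Boolean(flags && flags[name]);
}

const worker = {
  async fetch(request, env) {
    // type: 'json' parses the stored value; cacheTtl keeps the hot flag set
    // in this data centre for 60s, so most reads never leave the edge
    const flags = await env.FLAGS.get('flags.json', { type: 'json', cacheTtl: 60 });
    if (new URL(request.url).pathname.startsWith('/beta') && !isEnabled(flags, 'beta')) {
      return new Response('Not found', { status: 404 });
    }
    return fetch(request); // forward everything else to origin
  }
};
```

Because KV is eventually consistent, a flag flip may take up to the cacheTtl plus propagation time to reach every location, which is acceptable for feature flags but not for anything requiring exactly-once semantics (use Durable Objects there).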


When edge computing does not help

If your application is a monolithic server that performs complex database queries, machine learning inference, or heavy computation for every request, moving logic to the edge does not eliminate the origin dependency — it just adds an extra hop. Edge computing improves performance when the edge can resolve requests independently or when the computation at the edge is simpler than the origin processing it replaces.

For content-heavy sites, static site generators (like Docusaurus, which powers this site), and API-light applications, edge computing provides the largest benefit. For complex transactional applications, the edge is best used for authentication, routing, and caching — with the heavy logic remaining at the origin.

The economics also matter. Workers pricing is per-request and per-millisecond of CPU time. For simple routing and caching logic, the cost is negligible. For CPU-intensive edge processing at high traffic, the cost can exceed a traditional server.
