Cloudflare Workers and the platform around it
Cloudflare Workers is our default platform for new application work where edge latency, global distribution, and operational simplicity matter. Workers run on V8 isolates with sub-millisecond cold starts, expose Web Standards APIs (Fetch, Streams, Web Crypto, WebSockets), and deploy through Wrangler with versioned environments and near-instant rollbacks. The same TypeScript codebase that handles HTTP requests also consumes Queues, runs on scheduled cron triggers, hosts Durable Objects for stateful coordination, and holds long-lived WebSocket connections.
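A minimal sketch of that single-module shape. The JOBS queue binding and the cron schedule are illustrative assumptions (they would be declared in the Wrangler config); the fetch, scheduled, and queue handlers are the standard module-Worker entry points:

```ts
// Sketch only: one TypeScript module exporting the HTTP, cron, and queue
// entry points. The JOBS binding name is a hypothetical example.
export interface Env {
  JOBS: Queue; // hypothetical Queues producer binding
}

export default {
  // HTTP entry point: standard Fetch API Request/Response.
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/enqueue") {
      // Hand async work to the queue instead of doing it inline.
      await env.JOBS.send({ submittedAt: Date.now() });
      return new Response("queued", { status: 202 });
    }
    return new Response("ok");
  },

  // Cron entry point: runs on the schedule set in the Wrangler config.
  async scheduled(_controller: ScheduledController, env: Env): Promise<void> {
    await env.JOBS.send({ reason: "nightly-sweep" });
  },

  // Queue consumer entry point: receives batches from the JOBS queue.
  async queue(batch: MessageBatch<unknown>, _env: Env): Promise<void> {
    for (const message of batch.messages) {
      // Process each message, then acknowledge it explicitly.
      message.ack();
    }
  },
};
```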
The platform extends well beyond the compute layer. Workers AI is a managed inference layer with open-weight models (Llama, Mistral, embedding models, image generation, speech-to-text, vision) running on Cloudflare's GPU fleet, billed per request with no provisioning. D1 is Cloudflare's serverless SQLite database with read replication, migrations as code, and a binding API that calls into the database with no connection pool to manage; it fits the small-to-mid-size relational workloads where Postgres would be overkill. R2 is S3-compatible object storage with zero egress fees, which makes it the right home for static-site hosting, large file uploads, generated artifacts, and any payload that needs to leave the network for downstream consumption. Cloudflare Containers is the newer addition: standard OCI container images run in a managed runtime co-located with Workers, opening up workloads (long-running processes, language runtimes Workers can't host, headless browsers, build agents) that previously needed a separate VPS or Kubernetes cluster.
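A sketch of how the D1, R2, and Workers AI bindings look from inside a handler. The binding names (DB, ASSETS, AI), the sites table, and the model name are illustrative assumptions, not the schema or configuration of any project mentioned here:

```ts
// Sketch only: one request path touching D1, R2, and Workers AI bindings.
export interface Env {
  DB: D1Database;   // D1: serverless SQLite, no connection pool to manage
  ASSETS: R2Bucket; // R2: S3-compatible object storage, zero egress fees
  AI: Ai;           // Workers AI: managed open-weight inference
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // D1: parameterised query through the binding API.
    const site = await env.DB
      .prepare("SELECT id, title FROM sites WHERE hostname = ?")
      .bind(url.hostname)
      .first<{ id: string; title: string }>();
    if (!site) return new Response("not found", { status: 404 });

    // R2: serve the rendered page for this site straight from object storage.
    const page = await env.ASSETS.get(`${site.id}${url.pathname}`);
    if (page) {
      return new Response(page.body, {
        headers: { "content-type": "text/html; charset=utf-8" },
      });
    }

    // Workers AI: fall back to a managed open-weight model for a stub page.
    const result = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      prompt: `Write a one-line placeholder for ${site.title}.`,
    });
    return new Response(JSON.stringify(result), {
      headers: { "content-type": "application/json" },
    });
  },
};
```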
How we use the platform together: a Worker handles the API surface (routing, auth, business logic), D1 stores the relational data, R2 holds the static and binary assets, Queues smooth out async work, Durable Objects coordinate state across requests, Workers AI serves the inference where a managed open-weight model is the right choice, and Containers run the heavier workloads when a Worker isn't the right shape. The result is a single deployable platform with a single billing surface, deployment to every Cloudflare location with no regions to manage or fail over between, and operational ergonomics that beat traditional VPS or Kubernetes setups for the workloads that fit.
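A compressed sketch of that composition, with illustrative binding names; each binding is declared once in the Wrangler config and arrives typed on Env, so the Worker mostly wires the pieces together per request:

```ts
// Sketch only: an illustrative Env for the architecture above.
export interface Env {
  DB: D1Database;                      // relational data (D1)
  ASSETS: R2Bucket;                    // static and binary assets (R2)
  BUILD_JOBS: Queue;                   // async work (Queues)
  COORDINATOR: DurableObjectNamespace; // stateful coordination across requests
  AI: Ai;                              // managed open-weight inference
  // Container-backed workloads are also reached through a binding; the exact
  // shape depends on how the Containers runtime is configured.
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // Kick heavy work onto the queue for a consumer (or container) to pick up.
    if (url.pathname === "/rebuild") {
      await env.BUILD_JOBS.send({ hostname: url.hostname });
      return new Response("build queued", { status: 202 });
    }

    // Everything else routes through a Durable Object keyed by hostname,
    // which owns per-site state and reads D1 / R2 as needed.
    const id = env.COORDINATOR.idFromName(url.hostname);
    return env.COORDINATOR.get(id).fetch(request);
  },
};
```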
Recent work: webforcheap.com, our AI-assisted static-site platform for small businesses, runs entirely on Cloudflare's stack. Workers handle the API, dashboard, and per-site request routing; D1 stores accounts, sites, billing, and affiliate data; R2 hosts every customer site and serves traffic globally; Queues coordinate async build jobs; Durable Objects orchestrate Claude-driven build containers; Workers AI provides managed inference inside the build pipeline; and Cloudflare Containers run the heavier workloads that need a full container runtime. One platform, one bill, one deploy command per Worker.