The deepest sub-tree in the 2nth technology branch. Cloudflare's developer platform moved from "run JavaScript at the edge" to a full small cloud in roughly seven years: compute, storage, data, messaging, AI, and networking primitives, all bound together by a single wrangler config and one declarative bindings model. This hub is the map.
Sixteen services grouped into five bands: compute, storage, messaging & data, AI, and networking. Each card maps to a canonical leaf in tech/cloudflare/*. Workers is the only leaf with a full explainer today; the rest land in sequence.
The V8-isolate runtime that replaced a lot of origin servers. Streaming, service bindings, sub-millisecond cold starts. The first leaf to ship a full know.2nth.ai explainer.
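The programming model behind that card is small: a Worker is a module whose default export provides handler methods, and `fetch()` is the standard one. A minimal sketch, assuming nothing beyond the standard handler shape (the empty `Env` interface stands in for whatever bindings a real wrangler.toml would declare):

```typescript
// A Worker module: the default export's fetch() runs in a V8 isolate
// per request. `env` carries the bindings declared in wrangler.toml;
// it is empty here so the sketch stands alone.
export interface Env {}

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/health") {
      return new Response("ok");
    }
    return new Response(`Hello from ${url.pathname}`, {
      headers: { "content-type": "text/plain" },
    });
  },
};

export default worker;
```

Because the handler takes standard `Request`/`Response` objects, it can be exercised outside the runtime by calling `fetch()` directly with a constructed request.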
Git-driven static hosting with Functions. The "connect a repo, get a deploy" surface that pulls Workers in at the edges. This site runs on it.
Durable execution with step.do(): the orchestration primitive for long-running agent flows without writing your own retry logic.
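In the real API your class extends `WorkflowEntrypoint` from `cloudflare:workers` and receives a step object in `run()`. The runtime-free sketch below mirrors that shape with a locally declared step type; the step names and order payload are invented for illustration:

```typescript
// Local stand-in for the step object Workflows passes to run().
// In a real Workflow, each step.do() result is checkpointed, so a
// restarted instance skips steps that already completed.
type Step = {
  do<T>(name: string, fn: () => Promise<T>): Promise<T>;
  sleep(name: string, duration: string): Promise<void>;
};

export async function processOrder(step: Step, orderId: string): Promise<string> {
  const charge = await step.do("charge-card", async () => ({
    orderId,
    charged: true,
  }));
  // Durable sleep: the instance can be evicted and resume later
  // without burning compute while it waits.
  await step.sleep("settle", "10 seconds");
  return step.do("write-receipt", async () => `receipt-${charge.orderId}`);
}
```

The point of the shape: your flow reads as straight-line code, and the platform supplies the retries and checkpoints that would otherwise be hand-rolled.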
Multi-tenant execution: run customer Workers inside your Workers. The "platform-of-platforms" surface for SaaS builders.
Global, eventually consistent key-value. Sessions, feature flags, config: the first real state primitive Workers got.
S3-compatible object storage with zero egress fees. Presigned URLs, lifecycle rules, and the one storage service on this list whose zero-egress pricing undercuts its S3-class peers outright.
SQLite at the edge. Migrations, batch ops, single-primary semantics, and the honest constraints that come with being SQLite rather than Postgres.
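D1's surface is prepare/bind on the binding, then `all()`, `first()`, or `run()`. A sketch against a locally declared slice of that interface (the real type is `D1Database` on `env`; the table and columns here are invented):

```typescript
// Minimal slice of the D1 binding surface, declared locally so the
// sketch stands alone. The real binding arrives as env.DB.
interface D1Like {
  prepare(sql: string): {
    bind(...values: unknown[]): {
      all<T = Record<string, unknown>>(): Promise<{ results: T[] }>;
    };
  };
}

// Parameterized query: `?` placeholders are bound positionally,
// which also keeps the SQL injection-safe.
export async function recentPosts(db: D1Like, limit: number) {
  const { results } = await db
    .prepare("SELECT id, title FROM posts ORDER BY created_at DESC LIMIT ?")
    .bind(limit)
    .all<{ id: number; title: string }>();
  return results;
}
```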
Connection pooling to your real Postgres or MySQL. The escape hatch for when D1 isn't the right shape and the system of record already lives in a relational database you run.
Reliable messaging with dead-letter queues. The async AI fan-out pattern: producer Worker drops a job, consumer Worker processes on its own schedule.
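In binding terms, the producer side is the queue binding's `send()` and the consumer side is a `queue()` export on another Worker. A sketch with the binding type declared locally; the job payload and queue semantics shown in comments are illustrative:

```typescript
// Producer side: the queue binding exposes send(), which resolves
// once the message is durably accepted by the queue.
interface QueueLike<T> {
  send(body: T): Promise<void>;
}

type CrawlJob = { url: string };

export async function enqueue(jobs: QueueLike<CrawlJob>, url: string) {
  await jobs.send({ url });
}

// Consumer side, shaped like a queue() handler's batch loop: in the
// real runtime, messages that keep failing are retried and then
// routed to the dead-letter queue.
export async function consumeBatch(
  batch: CrawlJob[],
  handle: (job: CrawlJob) => Promise<void>,
) {
  for (const job of batch) {
    await handle(job);
  }
}
```

The decoupling is the point: the producer returns immediately, and the consumer's pace (and failures) never block the request path.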
Single-writer stateful compute. WebSocket hibernation, strong consistency, the primitive behind real-time collab and game state at the edge.
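A real Durable Object class receives `DurableObjectState` in its constructor and persists through its storage API; the sketch below keeps state in a Map so it runs anywhere, but the property it illustrates is the same one the card names: all requests for one object id land on one instance, so writes are serialized.

```typescript
// Durable Object sketch: one instance per object id, so increments
// are serialized without any locking. A real class would persist
// through the state's storage API instead of this in-memory Map.
export class Counter {
  private store = new Map<string, number>();

  async fetch(_request: Request): Promise<Response> {
    const next = (this.store.get("count") ?? 0) + 1;
    this.store.set("count", next);
    return new Response(String(next));
  }
}
```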
Time-series metrics you can write from any Worker. The token-economy telemetry store: cheap, append-only, queryable via SQL.
Inbound routing plus MailChannels for outbound. Transactional email that lives on the same network as the code that triggered it.
Edge inference across a catalogue of open models: LLMs, embeddings, vision, speech. JSON schema enforcement, no GPU management.
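The binding surface is `env.AI.run(model, inputs)`. A sketch with the binding type declared locally; the model id is illustrative and should be checked against the current Workers AI catalogue:

```typescript
// Local stand-in for the AI binding: run(model, inputs).
interface AiLike {
  run(model: string, inputs: Record<string, unknown>): Promise<unknown>;
}

// Model id is illustrative -- pick from the Workers AI catalogue.
const MODEL = "@cf/meta/llama-3.1-8b-instruct";

export async function summarize(ai: AiLike, text: string) {
  return ai.run(MODEL, {
    messages: [
      { role: "system", content: "Summarize the input in one sentence." },
      { role: "user", content: text },
    ],
  });
}
```

No GPU management shows up in the code at all: the model id is the only infrastructure decision the caller makes.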
Token metering, caching, and fallback routing for LLM calls: a proxy layer that sits between your Worker and OpenAI, Anthropic, or Workers AI.
Vector database with metadata filtering. The RAG pipeline's retrieval half: pairs with Workers AI embeddings for a fully edge-hosted stack.
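The retrieval half in code: embed the question, then run a nearest-neighbor query with a topK and an optional metadata filter. Both binding types below are local stand-ins (the real pair would be the Workers AI and Vectorize index bindings), and the filter field is an invented example:

```typescript
// Local stand-ins for the two bindings a RAG retrieval step uses:
// an embedder for the query vector, an index for nearest-neighbor
// search. Shapes are simplified sketches of the runtime types.
interface EmbedderLike {
  embed(text: string): Promise<number[]>;
}
interface IndexLike {
  query(
    vector: number[],
    opts: { topK: number; filter?: Record<string, unknown> },
  ): Promise<{ matches: { id: string; score: number }[] }>;
}

export async function retrieve(
  embedder: EmbedderLike,
  index: IndexLike,
  question: string,
) {
  const vector = await embedder.embed(question);
  const { matches } = await index.query(vector, {
    topK: 3,
    filter: { lang: "en" }, // metadata filter -- illustrative field
  });
  return matches.map((m) => m.id);
}
```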
Secure outbound connector from on-premise systems to Workers. The right answer when the system of record lives behind someone's firewall.
Sixteen services could look like sixteen different APIs. They don't, because Cloudflare made a deliberate decision: every resource is a binding, declared in wrangler.toml and delivered on your Worker's env argument at runtime. No client libraries to keep in sync, no credentials to rotate, no URLs to hardcode.
When you need KV, D1, R2, a Durable Object, a Queue, or a Workers AI model, you declare it in config. At runtime the binding appears as a property on env, already authenticated and already pointing at the right namespace. You call methods on it; you don't construct clients.
```toml
# wrangler.toml - one config, many primitives
name = "edge-app"
main = "src/index.ts"

[[kv_namespaces]]
binding = "CACHE"
id = "..."

[[d1_databases]]
binding = "DB"
database_name = "prod"

[[r2_buckets]]
binding = "ASSETS"
bucket_name = "uploads"

[[durable_objects.bindings]]
name = "ROOM"
class_name = "ChatRoom"

[[queues.producers]]
binding = "JOBS"
queue = "ingest-jobs"

[ai]
binding = "AI"
```
The payoff: you learn this pattern once and every service in this hub fits into it. Your Worker calls env.CACHE.get(), env.DB.prepare(), env.AI.run(), env.JOBS.send(). Same shape every time.
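To make "same shape every time" concrete, here is a hedged cache-aside sketch combining a KV binding named CACHE and a D1 binding named DB. Both binding types are declared locally so the sketch stands alone, and the table, key scheme, and TTL are invented:

```typescript
// Two bindings, one read path: check KV first, fall back to D1,
// backfill KV. Types below are local stand-ins for the runtime ones.
interface KvLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}
interface DbLike {
  prepare(sql: string): {
    bind(...v: unknown[]): { first<T>(): Promise<T | null> };
  };
}

export async function getTitle(env: { CACHE: KvLike; DB: DbLike }, id: number) {
  const cached = await env.CACHE.get(`title:${id}`);
  if (cached !== null) return cached;

  const row = await env.DB
    .prepare("SELECT title FROM posts WHERE id = ?")
    .bind(id)
    .first<{ title: string }>();
  if (row === null) return null;

  // KV is eventually consistent, which is fine for a read-through cache.
  await env.CACHE.put(`title:${id}`, row.title, { expirationTtl: 300 });
  return row.title;
}
```

Note what is absent: no clients constructed, no credentials, no URLs. Both resources are just properties on env.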
Cloudflare is the runtime almost every other leaf in the 2nth tree requires. Every ERP explainer points back to Workers for the proxy; every AI flow lands on Workers AI or Vectorize; every SOR integration uses Tunnel or Hyperdrive.