cloudflare-parallel v0.3 · live demo

4N parallel V8 isolates per Worker request.

Burn CPU across real parallel isolates on Cloudflare Workers. Hash, transform, simulate, render, evolve — work that single-threaded JavaScript can't fan out, this library can. Pick a size below and watch the topology selector decide whether to run in-DO, hybrid, or tree.

This library is for CPU work. If you're awaiting fetch / KV / AI / R2, plain Promise.all on one isolate already gives you that for free. When to use →
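The contrast in one runnable sketch: simulated I/O waits overlap on a single isolate, so Promise.all alone covers them (sleep stands in for fetch / KV / R2 here).

```javascript
// Ten simulated 50 ms I/O waits overlap on a single isolate:
// total wall-clock is ~50 ms, not 500 ms. A CPU loop gets no such
// overlap, which is the gap this library fills.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function ioDemo() {
  const t0 = Date.now();
  await Promise.all(Array.from({ length: 10 }, () => sleep(50)));
  return Date.now() - t0; // wall-clock, ≈50 ms
}
```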

① Hero · pool.map over a CPU-bound function


Each item: a SHA-256 chain (5,000 iterations over a 64-byte buffer). ~80–150 ms of CPU per item on workerd. Pick a fan-out size; the library picks the topology.

Live panel: fan-out size · topology · wall-clock parallel (ms) · sequential baseline (ms, extrapolated from an N=32 sample) · speedup (×)
await pool.map(sha256Chain, items)

② Topology · how the selector picks


The auto-selector reads items.length and picks one of three shapes. Hover any row to see the math; click Run to fan out and update the live numbers.

Size   Topology   Math                        V8 isolates
4      in-do      1 DO × 4 loaders            4
32     hybrid     ⌈32/4⌉ = 8 leaves × 4       32
128    hybrid     ⌈128/4⌉ = 32 leaves × 4     128 (4N)
256    tree       K=2 tiers, F=8              4·F²
512    tree       K=3 tiers, F=8              4·F³
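One way the shapes above could be chosen. The size thresholds and the F=8 tree fan-out below are read off this table, not lifted from the library's actual selector.

```javascript
// Size-driven topology choice, inferred from the table above.
const LOADERS_PER_LEAF = 4; // isolates spawned per leaf DO

function pickTopology(n) {
  if (n <= 4) return { shape: 'in-do', isolates: n };
  if (n <= 128) {
    return { shape: 'hybrid', leaves: Math.ceil(n / LOADERS_PER_LEAF), isolates: n };
  }
  const F = 8; // fan-out per tree tier
  let tiers = 1;
  let capacity = LOADERS_PER_LEAF * F;
  while (capacity < n) { tiers++; capacity *= F; } // grow tiers until 4·F^K covers n
  return { shape: 'tree', tiers, fanout: F };
}
```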

③ Every primitive, hands-on


Each card runs against the live test worker. The code shown is the actual call being made.

④ Scheduler · reactive job queue


Enqueue a CPU-bound burst (each job: 1M LCG iterations). Reactive dispatch starts work the moment a slot frees — no alarm-batched delay. Fair round-robin across tenantId.
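What one job burns, sketched. The multiplier and increment here are the classic Numerical Recipes LCG pair, an assumption rather than the demo's confirmed constants.

```javascript
// One job's CPU payload: 1M iterations of a 32-bit linear
// congruential generator. Deterministic, allocation-free, pure CPU.
function lcgBurn(seed, iterations = 1_000_000) {
  let x = seed >>> 0;
  for (let i = 0; i < iterations; i++) {
    x = (Math.imul(1664525, x) + 1013904223) >>> 0;
  }
  return x;
}
```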

Live counters: queued · in-flight · completed · failed · cancelled
await scheduler.enqueue({ fn, args, tenantId, idempotencyKey })
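A toy model of the fairness policy only; RoundRobinQueue is a hypothetical name, not a library export.

```javascript
// Fair round-robin: cycle through per-tenant FIFO queues so one
// noisy tenant can't starve the rest. Dispatch order only; no
// slot accounting or persistence.
class RoundRobinQueue {
  constructor() {
    this.queues = new Map(); // tenantId -> FIFO of jobs
    this.order = [];         // rotation order of tenantIds
  }
  enqueue(tenantId, job) {
    if (!this.queues.has(tenantId)) {
      this.queues.set(tenantId, []);
      this.order.push(tenantId);
    }
    this.queues.get(tenantId).push(job);
  }
  next() {
    for (let i = 0; i < this.order.length; i++) {
      const t = this.order.shift();
      this.order.push(t); // rotate tenant to the back
      const q = this.queues.get(t);
      if (q.length) return q.shift();
    }
    return undefined; // nothing queued anywhere
  }
}
```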

⑤ Actor · pinned state across submits


A counter Actor whose state lives in a Coordinator DO's SQLite. Each submit mutates the state in place; the runtime structured-clones it after each call. Close the Actor; the state is gone.

Counter
await actor.submit((state) => ++state.count)
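A local model of those submit semantics; LocalActor is illustrative, not the library's Actor. State is mutated in place, then a structured clone is taken after each call, mirroring the snapshot the runtime would persist.

```javascript
// Illustrative stand-in for the Actor submit loop: run the function
// against live state, then structuredClone the result as the
// post-call snapshot (the piece the real runtime persists to SQLite).
class LocalActor {
  constructor(initial) {
    this.state = structuredClone(initial);
  }
  async submit(fn) {
    const result = fn(this.state);                 // mutates state in place
    this.snapshot = structuredClone(this.state);   // post-call snapshot
    return result;
  }
}
```

Usage mirrors the counter above: `new LocalActor({ count: 0 })` then repeated `submit((s) => ++s.count)` calls return 1, 2, …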

⑥ VM · sandboxed user code over HTTP


Paste a function expression. It runs in a fresh sandboxed V8 isolate with globalOutbound: null (no fetch) and zero bindings exposed. Bearer-auth required. Don't try to bypass the sandbox — it's load-bearing.
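A guess at the request shape: the /vm/run path, text/plain body, and JSON response are assumptions; only the bearer-auth requirement comes from the text above.

```javascript
// Hypothetical client for the VM endpoint. Path and response format
// are assumptions, not documented API.
async function runInVm(baseUrl, token, source) {
  const res = await fetch(`${baseUrl}/vm/run`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`, // bearer auth is required
      'Content-Type': 'text/plain',
    },
    body: source, // the pasted function expression
  });
  if (!res.ok) throw new Error(`VM call failed: ${res.status}`);
  return res.json();
}
```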

click Submit to run

⑦ Cancel · live AbortSignal


Start a 1M-iteration SHA chain inside an isolate. Hit Cancel; the request closes; env.signal.aborted trips inside the loaded isolate; the loop returns early. Watch the iteration counter stop.

Status: idle
const cancel = new CancelToken();
await pool.submit(longLoop, iters, { cancel });
// hit /demo/cancel-start; close the SSE to fire cancel.

⑧ Bench · live edge measurements

bench-results-live.json

The honest curve. Each row: a SHA-256 chain over N items, parallel via pool.map vs. the sequential baseline. Speedup grows with size as the dispatch overhead amortizes; the tree topology kicks in at 256.

Size · Topology · Sequential (ms) · Parallel (ms) · Speedup · Run live
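Why the curve bends upward, as a toy cost model rather than measured data: a fixed dispatch overhead is paid once, while per-item work divides across isolates.

```javascript
// T_seq = N·c; T_par = overhead + c·⌈N/P⌉ with P parallel isolates.
// As N grows the fixed overhead shrinks relative to useful work and
// speedup climbs toward P. Numbers plugged in are illustrative only.
function modelSpeedup(n, perItemMs, overheadMs, parallelism) {
  const seq = n * perItemMs;
  const par = overheadMs + perItemMs * Math.ceil(n / parallelism);
  return seq / par;
}
```

With 100 ms items and 500 ms of overhead, the model gives roughly 5.3× at N=32 (P=32) and roughly 43× at N=256 (P=256); illustrative numbers, not the live bench.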